Signway

Bring the power of pre-signed URLs to your LLM apps

[Diagram: Signway's scheme]

No data streaming in serverless architectures.

  • If you are using a serverless architecture for your LLM app, you might have found that streaming data from APIs like OpenAI's back to the end-user is either impossible or expensive.
  • One "option" is to wait for the LLM API to return the full response and then send it back to the end-user, but then you are paying for a long-running, almost idle serverless function, and the end-user gets a poor experience (see the sketch after this list).
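
To make the trade-off concrete, here is a minimal Python sketch of that "wait for the full response" workaround. The `handler(event, context)` signature, event shape, and model name are generic placeholders (not tied to any particular cloud provider); only the OpenAI chat completions endpoint is taken as given.

```python
# A minimal sketch of the "wait for everything" workaround: the serverless
# function blocks until the LLM API has produced the ENTIRE completion.
import json
import os

import requests


def handler(event, context):
    # You pay for near-idle compute for the whole duration of this call,
    # and the end-user sees nothing until it finishes.
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",  # placeholder model
            "messages": json.loads(event["body"])["messages"],
        },
        timeout=120,
    )
    # Only now can anything be sent back to the end-user.
    return {"statusCode": 200, "body": response.text}
```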
[Diagram: before Signway]
[Diagram: after Signway]

Offload IO streaming to what is designed for it.

  • You can offload data streaming tasks to Signway. Your signed requests, if authentic and not expired, are proxied to the destination API directly from the user's device.
  • The end-user receives the streaming response directly on their device, and your serverless function only needs to create a pre-signed URL for them to use; no data streaming is involved (see the sketch below).
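
As an intuition for what the serverless function does instead, here is a minimal Python sketch of the pre-signed URL idea. The host, parameter names, query-string layout, and HMAC construction below are illustrative assumptions, not Signway's actual signing scheme (use the Signway SDK for real signatures); the point is that the function only signs and returns a URL.

```python
# Rough illustration of pre-signed URLs, NOT Signway's exact signature scheme:
# the backend signs the request parameters with a shared secret and returns a
# URL, and it never touches the streamed response body.
import hashlib
import hmac
import time
from urllib.parse import urlencode

SIGNWAY_HOST = "https://my-signway-instance.example.com"  # hypothetical deployment
APP_ID = "my-app"           # hypothetical application id
APP_SECRET = b"my-secret"   # shared only between your backend and Signway


def create_presigned_url(proxy_to: str, expires_in: int = 60) -> str:
    params = {
        "id": APP_ID,
        "proxy": proxy_to,                       # the upstream API to proxy to
        "expires": int(time.time()) + expires_in,
    }
    payload = urlencode(sorted(params.items())).encode()
    params["signature"] = hmac.new(APP_SECRET, payload, hashlib.sha256).hexdigest()
    return f"{SIGNWAY_HOST}/?{urlencode(params)}"


def handler(event, context):
    # Returns immediately: no long-lived connection, no streaming in the function.
    url = create_presigned_url("https://api.openai.com/v1/chat/completions")
    return {"statusCode": 200, "body": url}
```

The end-user's device then sends its request to that URL; Signway verifies the signature and expiry and, if valid, proxies the request to the destination API and streams the response straight back to the device.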

Pay-as-you-go model

Free tier

  • Fully managed
  • 10 requests/second
  • 1 Mb/request data transfer
  • 1 Application

On demand

  • Fully managed
  • 1000 requests/second
  • 100 Mb/request data transfer
  • 10 Applications
  • 0.009 $/Mb transferred
  • Join the waitlist