Bring the power of pre-signed URLs to your LLM apps
No data streaming in serverless architectures
- If you are using a serverless architecture for your LLM app, you might have found that streaming data from APIs like OpenAI's back to the end-user is either impossible or expensive.
- One "option" is to wait for the LLM API to return the full response and then send it back to the end-user. But then you are paying for a long-running, mostly idle serverless function, and the end-user gets a worse experience.
Delegate IO streaming tasks to something designed for them
- You can delegate data streaming tasks to Signway. Your signed requests, if authentic and not expired, are proxied to the destination API directly from the end-user's device.
- The end-user receives the streaming response directly on their device. Your serverless function only needs to create a pre-signed URL for the end-user to use, with no data streaming involved.
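The flow above can be sketched with a generic HMAC-based signing scheme. This is a minimal illustration only, not Signway's actual SDK or signature format: the function names, query parameters (`id`, `proxy`, `expires`, `signature`), and signing payload are assumptions for the sake of the example.

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

# Hypothetical shared secret between your backend and the proxy.
SECRET = "app-secret"
APP_ID = "my-app"

def sign_url(proxy_base: str, destination: str, expires_in: int = 60) -> str:
    """Run server-side: create a pre-signed proxy URL the end-user can call directly."""
    expiry = int(time.time()) + expires_in
    payload = f"{APP_ID}\n{destination}\n{expiry}"
    signature = hmac.new(SECRET.encode(), payload.encode(), hashlib.sha256).hexdigest()
    query = urlencode({
        "id": APP_ID,
        "proxy": destination,
        "expires": expiry,
        "signature": signature,
    })
    return f"{proxy_base}?{query}"

def verify(app_id: str, destination: str, expiry: str, signature: str) -> bool:
    """What the proxy would check before streaming: authentic and not expired."""
    payload = f"{app_id}\n{destination}\n{int(expiry)}"
    expected = hmac.new(SECRET.encode(), payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature) and int(expiry) > time.time()
```

Your serverless function returns the signed URL and exits immediately; the end-user's device then calls that URL and receives the streamed response from the proxy, never from your function.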
Pay-as-you-go model
1 MB/request data transfer
100 MB/request data transfer
$0.009/MB transferred