Deploying Serverless Inference Endpoints

Trelis Research · 3,059 views · 1 year ago

➡️ ADVANCED-inference Repo (incl. setup guide and inference scripts): https://trelis.com/enterprise-server-api-and-inference-guide/
➡️ RunPod Affiliate Link (supports the Trelis channel): https://tinyurl.com/4b6ecbbn

VIDEO RESOURCES:
- Slides: https://docs.google.com/presentation/d/1wlodg6PbvbYCDGoMGzAP7T7NwY2sL346QpHQymYTm_Y/edit?usp=sharing

OTHER TRELIS LINKS:
➡️ One-click API Templates: https://github.com/TrelisResearch/one-click-llms/
➡️ ADVANCED-fine-tuning Repo: https://trelis.com/ADVANCED-fine-tuning-scripts/
➡️ ADVANCED-vision Repo: https://trelis.com/ADVANCED-vision/
➡️ ADVANCED-transcription Repo: https://trelis.com/ADVANCED-transcription/
➡️ Trelis Function-calling Models: https://trelis.com/function-calling/
➡️ Trelis Newsletter: https://Trelis.Substack.com
➡️ Tip Jar: https://ko-fi.com/trelisresearch

TIMESTAMPS:
0:00 Deploying a Serverless API Endpoint
0:17 Serverless Demo
1:19 Video Overview
1:44 Serverless Use Cases
3:31 Setting up a Serverless API
13:31 Inferencing a Serverless Endpoint (see the example request sketch below)
17:55 Serverless Costs versus GPU Rental
20:02 Accessing Instructions and Scripts
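
EXAMPLE REQUEST (sketch): the video's own inference scripts live in the ADVANCED-inference repo linked above; below is only a minimal Python sketch of hitting a RunPod serverless endpoint over its /runsync route. The endpoint ID placeholder and the inner "prompt" field are assumptions here; the exact payload schema depends on the worker you deploy.

# Minimal sketch of calling a RunPod serverless endpoint (not the repo's script).
# ENDPOINT_ID is a hypothetical placeholder; the inner payload fields depend on
# the handler running on your endpoint.
import os
import requests

ENDPOINT_ID = "your-endpoint-id"            # hypothetical placeholder
API_KEY = os.environ["RUNPOD_API_KEY"]      # RunPod API key from account settings

url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
# RunPod serverless requests wrap the handler's arguments in an "input" object.
payload = {"input": {"prompt": "Explain serverless inference in one sentence."}}

response = requests.post(url, headers=headers, json=payload, timeout=120)
response.raise_for_status()
print(response.json())

Because /runsync blocks until the worker finishes (including any cold start), long-running jobs are usually sent to the asynchronous /run route instead and polled for status.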
