Nov 13th, 2024 - Ollama, Qwen2.5-Coder, Continue, and Rider: Your Local Copilot

Published: 17 November 2024
on the channel: Jamie Taylor

On November 13th, 2024, Jamie ran a short (for him, anyway) stream showing how he used Ollama, two free language models (qwen2.5-coder and llama3.2), and Continue to create a GitHub Copilot-like experience across the entire suite of JetBrains products.
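
If you want to follow along, the rough setup is: install Ollama (link below), then pull the two models before wiring up Continue. These are the standard Ollama CLI commands, not commands copied verbatim from the stream:

  # grab the two models mentioned in the stream
  ollama pull qwen2.5-coder
  ollama pull llama3.2
  # confirm they're available locally
  ollama list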

This was originally streamed on Twitch at twitch.tv/gaprogman

This stream focused on using Rider and a .NET project of his, but the Continue config (which is stored in your user area) is automatically picked up by any JetBrains product AND by Visual Studio Code if the Continue extension is installed; a rough example of that config is included below.
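
As a rough idea, here's a minimal sketch of what that config might look like, based on Continue's documented config.json format (typically ~/.continue/config.json) rather than copied from the stream; the file location, field names, and which model handles chat versus autocomplete are illustrative, so check the Continue docs for the current format:

  {
    "models": [
      {
        "title": "Qwen2.5 Coder (local)",
        "provider": "ollama",
        "model": "qwen2.5-coder"
      },
      {
        "title": "Llama 3.2 (local)",
        "provider": "ollama",
        "model": "llama3.2"
      }
    ],
    "tabAutocompleteModel": {
      "title": "Qwen2.5 Coder (autocomplete)",
      "provider": "ollama",
      "model": "qwen2.5-coder"
    }
  }

Because the file lives in your user area, editing it once makes the same local models available in Rider, the other JetBrains IDEs, and VS Code.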

Links mentioned in this stream:

Ollama - run large language models locally: https://ollama.com/
Llama3.2 - a small language model created by Meta: https://ollama.com/library/llama3.2/
Qwen2.5-Coder - a large language model trained on code: https://ollama.com/library/qwen2.5-coder
Continue - add GitHub Copilot-like functionality to VS Code or any JetBrains product: https://www.continue.dev/
OwaspHeaders.Core - a NuGet package for adding OWASP-recommended HTTP headers to ASP.NET Core applications: https://www.nuget.org/packages/OwaspH...
Chillhop - the chillest music on the Internet: https://chillhop.com/
The Modern .NET Show - the premier .NET podcast, focusing entirely on the knowledge, tools, and frameworks that all .NET developers should have in their toolbox: https://dotnetcore.show/