[ 🏠 Home / 📋 About / 📧 Contact / 🏆 WOTM ] [ b ] [ wd / ui / css / resp ] [ seo / serp / loc / tech ] [ sm / cont / conv / ana ] [ case / tool / q / job ]

/b/ - Random


File: 1771158106833.jpg (201.37 KB, 1080x608, img_1771158097851_56p9f6n5.jpg)

f1aa3 No.1327

so i've been playing around with getting offline ai code suggestions directly in vs code via ollama plus the continue extension. it's pretty cool because you can keep your projects fully private while still getting smart autocomplete. basically, instead of connecting to public models like chatgpt (which might not always respect privacy), this setup keeps everything local on your machine. has anyone tried setting up something similar? i'd love to hear any tips and tricks you've picked up along the way!

Source: https://www.sitepoint.com/local-llm-code-completion-vs-code-ollama/?utm_source=rss
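for reference, this is roughly what the relevant part of continue's config.json looked like on my end — the model name is just the one i happened to pull with `ollama pull`, swap in whatever fits your hardware:

```json
{
  "tabAutocompleteModel": {
    "title": "qwen2.5-coder",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

once that's in place and the ollama server is running, autocomplete suggestions should start showing up inline in the editor.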

f1aa3 No.1328

File: 1771159131513.jpg (125.15 KB, 1880x1253, img_1771159116086_gvtucxgk.jpg)

had the same setup issue with ollama and vs code. turns out i needed to update my vs code extensions first before it worked smoothly… wasted a good hour on that one! hope this helps you avoid some frustration. eventually got through it though, so it's not all doom & gloom here
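one more thing that saved me some time: before blaming the extension, it's worth checking the ollama server is actually reachable. as far as i know it listens on localhost:11434 by default (that port is from the ollama docs, adjust if you changed it):

```shell
# quick sanity check: list the models ollama has pulled, as JSON.
# if the server isn't up, print a note instead of a curl error.
curl -s http://localhost:11434/api/tags || echo "ollama server not reachable"
```

if you get json back with your model listed, the problem is on the editor side; if not, start ollama first.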


