Local LLM access improvements (e.g. LM Studio)
under review
Rose Cuckoo
Hello, I have an AI server on my network that runs all of my LLM tools, from LM Studio to Automatic1111, ComfyUI, etc. Currently, connecting to LM Studio requires me to open NovelCrafter in a browser on that same machine. Could you please add an option to enter the server's IP, rather than assuming localhost on 127.0.0.1?
Thank you for the consideration. This tool is amazing.
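For context, LM Studio exposes an OpenAI-compatible HTTP API (default port 1234), so the request here boils down to making the base URL configurable instead of hard-coded. A minimal sketch of what that could look like on the client side, assuming Node 18+ for the global `fetch`; the `LMSTUDIO_URL` variable and the `192.168.1.50` fallback address are purely illustrative, not an actual NovelCrafter setting:

```ts
// Minimal sketch: query an LM Studio server at a configurable address
// instead of a hard-coded 127.0.0.1. LMSTUDIO_URL and the fallback
// LAN IP below are illustrative assumptions, not real settings.
const baseUrl = process.env.LMSTUDIO_URL ?? "http://192.168.1.50:1234";

async function listModels(): Promise<void> {
  // LM Studio's OpenAI-compatible API lists loaded models at /v1/models.
  const res = await fetch(`${baseUrl}/v1/models`);
  if (!res.ok) {
    throw new Error(`LM Studio not reachable at ${baseUrl}: HTTP ${res.status}`);
  }
  console.log(await res.json());
}

listModels().catch((err) => console.error(err));
```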
spaceemotion marked this post as under review
spaceemotion
A while back, in internal testing we were unable to get LM Studio to serve on anything other than exactly 127.0.0.1. A user asked on their Discord and was told that this behavior was intended. Has this changed in the meantime?
Rose Cuckoo
spaceemotion: While LM Studio only lets you pick the port, it serves on localhost in a way that works for both 127.0.0.1 AND the server's LAN IP. I connect other apps to its API across my local network.
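If anyone wants to verify this on their own setup, here is a rough probe that checks whether the same server answers on both loopback and the LAN address. It assumes LM Studio's default port 1234, and `192.168.1.50` is a stand-in for whatever IP your server actually has:

```ts
// Rough check: does the LM Studio server answer on both loopback and
// the LAN address? Replace 192.168.1.50 with your server's real IP.
const hosts = ["http://127.0.0.1:1234", "http://192.168.1.50:1234"];

async function probe(): Promise<void> {
  for (const host of hosts) {
    try {
      const res = await fetch(`${host}/v1/models`);
      console.log(`${host} -> ${res.ok ? "reachable" : `HTTP ${res.status}`}`);
    } catch (err) {
      console.log(`${host} -> unreachable (${(err as Error).message})`);
    }
  }
}

probe();
```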
spaceemotion
Rose Cuckoo: huh! that's good to know, will have to test that again :)