An IT worker in finance wants to create a **local AI assistant** that can read client data securely, provide daily overviews, and connect to Power BI, Excel, and Outlook without exposing data externally. They're planning to run Llama on a Mac Studio with Open WebUI, implement audit logging via an nginx reverse proxy, and scale from 5-10 users to 50 users. **They're questioning whether this setup is realistic or if they're caught up in AI hype.**
Hi there, I work in the IT department of a financial-industry company and have dabbled with setting up our own local AI. I was given the following requirements:

- Local AI that can work as an assistant (e.g. give a daily overview)
- Able to read our client data without exposing it to the outside
As far as I understand, I can run Llama on a Mac Studio inside our local network without any problems and connect it via MCP to Power BI, Excel, and Outlook. I want to expose it through Open WebUI, give it a static URL, and then let it run (it would also work when somebody connects to the server via VPN).
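For context, here's a rough sketch of what that setup could look like on macOS. All of this is an assumption on my part (the model tag, ports, and volume name are placeholders): Ollama runs natively so it can use the Mac's GPU via Metal, and Open WebUI runs in Docker pointing at it:

```shell
# Sketch, not a tested deployment. Ollama natively on macOS (Metal GPU):
brew install ollama
ollama serve &                 # local API on http://localhost:11434
ollama pull llama3.1:70b       # model choice is a placeholder

# Open WebUI in Docker, talking to the host's Ollama instance:
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui ghcr.io/open-webui/open-webui:main
```

Open WebUI would then be reachable on port 3000, which is where the static URL / VPN access would point.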
I was also asked to create an audit log of the requests (so: which user, what prompts, which documents, etc.). Claude suggested an nginx reverse proxy, which I definitely have to read up on.
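If I understand the idea correctly, it would look roughly like this — nginx sits in front of Open WebUI and writes a JSON access log per request. This is only a sketch I put together from the nginx docs (hostname, paths, and the upstream port are assumptions, and `$request_body` is only captured for requests nginx actually proxies with a body):

```nginx
# Hypothetical audit-log sketch: nginx in front of Open WebUI on :8080.
log_format audit escape=json
    '{"time":"$time_iso8601",'
    '"user":"$remote_user",'        # populated when basic auth is used
    '"uri":"$request_uri",'
    '"body":"$request_body"}';      # the chat API POST payload (prompts)

server {
    listen 443 ssl;
    server_name ai.internal.example;   # placeholder hostname

    access_log /var/log/nginx/ai_audit.log audit;

    location / {
        auth_basic "AI Assistant";
        auth_basic_user_file /etc/nginx/htpasswd;  # placeholder path
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header X-Forwarded-User $remote_user;
    }
}
```

One thing I'm unsure about: Open WebUI has its own user accounts, so logging at the proxy level only captures the HTTP user, not necessarily the WebUI identity — that may need to come from Open WebUI itself.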
Am I just caught up in the AI hype, or is it reasonable to run this? (Initially with 5-10 users, and then maybe upscale the equipment for 50?)