:IF: Queeni AI Assistant

uhm … one issue you might run into: you may well have an embedding model in your chain too, and that takes GPU memory as well … 8GB doesn’t leave much headroom :smiley: … but without starting somewhere (at the lower limit) you won’t know what you really need :slight_smile:
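A back-of-envelope budget (all figures below are rough assumptions, not measurements) shows why 8GB gets tight quickly:

```python
# Rough VRAM budget for an 8 GB card running a quantized chat model plus an
# embedding model. All numbers are approximate assumptions for illustration.

chat_model_gb = 7e9 * 0.5 / 1e9   # ~7B params at 4-bit quantization ≈ 3.5 GB of weights
kv_cache_gb = 1.0                 # KV cache + runtime overhead, grows with context length
embedding_gb = 0.5                # a small embedding model (a few hundred million params)

total_gb = chat_model_gb + kv_cache_gb + embedding_gb
print(f"~{total_gb:.1f} GB of 8 GB used")  # ≈ 5 GB, so it fits, but without much to spare
```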

2 Likes

And if someone wants to use decentralized AI but doesn’t want to run local infrastructure, they can use https://nosana.com to rent GPU power. If they can afford around $30–40 per month, it’s a great alternative to ChatGPT - their own truly decentralized AI, without needing any local setup.

That way, Queeni can run the same personal AI model across all devices - desktop, mobile, different OS - all synced and powered by one shared decentralized brain. :brain::globe_showing_europe_africa::high_voltage:

4 Likes

I think I need to port-forward my local model-server then :shaking_face:

2 Likes

Will Queeni be able to use the OpenAI API?

If so, there is also Venice.ai, which uses the OpenAI API specification and lets you pay with Bitcoin ($149/year). So if you use their API through a proxy and pay with BTC, you’re relatively anonymous. They offer an uncensored AI experience and have a solid privacy policy too.

3 Likes

Yes, Queeni currently uses the OpenAI API.
You just need to create an OpenAI account, generate an API key, and paste it into Queeni’s settings. From that point on, usage is tied to your key, and you pay OpenAI directly.
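For anyone curious what that looks like under the hood, here is a minimal sketch using the official `openai` Python package — this is not Queeni’s actual code (Queeni itself is a MAUI app), and the model name is just an example:

```python
# Minimal sketch using the official `openai` Python package (pip install openai).
# Not Queeni's actual code - just what a request billed to your own key looks like.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # the key you paste into Queeni's settings

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name; use whatever your key has access to
    messages=[{"role": "user", "content": "Remind me to feed the cat at 7 PM."}],
)
print(response.choices[0].message.content)
```

Everything sent with that key shows up on your own OpenAI usage dashboard, which is what “you pay OpenAI directly” means in practice.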

About Venice.ai - if they’re using OpenAI under the hood, I’m genuinely curious how they guarantee an “uncensored” experience and “solid privacy.” :thinking:
In the end, all your prompts still go through OpenAI’s servers, so whatever privacy you think you gain by paying with BTC, you’re still handing over your data to the same model provider.

Unless they’ve found a magical black hole to reroute tokens through - it’s still OpenAI behind the curtain :grinning_face_with_smiling_eyes:

3 Likes

They don’t use OpenAI, they just use the same specification as OpenAI’s API. That means the communication between their AI and an app that uses the API is identical, so you can point the app at either provider.
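In practice, switching providers is mostly a matter of changing the base URL the client talks to. A minimal sketch (the Venice endpoint below is an assumption for illustration — check their docs for the real endpoint and model names):

```python
# Same client library, different provider: an OpenAI-compatible API only needs
# a different base_url and key. The Venice URL here is an assumption for
# illustration - check the provider's docs for the actual endpoint and models.
from openai import OpenAI

openai_client = OpenAI(api_key="sk-...")      # default endpoint: api.openai.com

venice_client = OpenAI(
    base_url="https://api.venice.ai/api/v1",  # assumed OpenAI-compatible endpoint
    api_key="your-venice-key",
)

# Both clients accept the same chat.completions.create(...) calls,
# which is what lets an app swap providers without code changes.
```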

2 Likes

Ah, yes. Easy to switch from one API to another when they’re the same.

1 Like

There are many APIs that use the OpenAI specification, but I’m not sure which of them support full Function Calling. For Queeni, it’s not just about chatting - the assistant can actually create categories, tasks, and notifications inside the app and save them directly to the database.

So in the video, you’re not just seeing a conversation - you’re seeing real interaction between the AI and the app logic. That means Queeni doesn’t just talk. She acts. Queeni creates a task in the Windows app and it pops up in the Android app.
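For the technically curious, this is roughly what the Function Calling round-trip looks like at the API level. It’s a sketch, not Queeni’s actual schema — `create_task` and the handler mentioned in the comments are hypothetical names for illustration:

```python
# Sketch of the Function Calling round-trip: the model doesn't write to the
# database itself, it returns a structured tool call that the app executes.
# `create_task` (and the db handler referenced below) are hypothetical names
# for illustration, not Queeni's real schema.
import json
from openai import OpenAI

client = OpenAI(api_key="sk-...")

tools = [{
    "type": "function",
    "function": {
        "name": "create_task",
        "description": "Create a task in the app and sync it to all devices.",
        "parameters": {
            "type": "object",
            "properties": {
                "title": {"type": "string"},
                "due": {"type": "string", "description": "ISO 8601 date-time"},
            },
            "required": ["title", "due"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model; any model with tool support works
    messages=[{"role": "user", "content": "Remind me to feed the cat at 7 PM."}],
    tools=tools,
)

# If the model decided to act, it returns a tool call instead of plain text.
for call in response.choices[0].message.tool_calls or []:
    if call.function.name == "create_task":
        args = json.loads(call.function.arguments)
        print("App logic would now run with:", args)
        # e.g. a hypothetical save_task_to_db(**args), after which the synced
        # database pushes the reminder to the Android app as well.
```

The key point: the model never touches the database itself. It returns a structured tool call, and the app decides how to execute it - which is also why not every OpenAI-compatible provider is good enough here; it has to support tool calls reliably.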

In theory, this same mechanism could be extended to interact with native APIs on the phone. MAUI allows this.

For example:

  1. Grandma tells Queeni: “Set my alarm for 5 in the morning, I’ve got pickles to jar!” :cucumber::alarm_clock: And Queeni doesn’t just say “Okay, I’ll remind you” - she actually sets a real alarm on Grandma’s Android phone. Not just suggesting one, but triggering it through the system.

  2. You ask Queeni on your Windows laptop: “Remind me to feed the cat at 7 PM.” :cat_face::alarm_clock: Queeni creates a task right there in the Windows app - and then, like magic, the reminder pops up on your Android phone too. One assistant. All your devices. No cat left unfed.

That’s the kind of assistant we’re talking about.
That’s the direction I’m aiming for - AI with real-world actions, not just text.

3 Likes