Create "packages" of context for popular API & Libraries - make them freely available via public url
Keep the packages up to date. They'll be newer than the cutoff date for many models, and they'll be cleaner than the data a model slurps in via web search.
Voila, you're now the trusted repository of context for the developer community.
You'll get a lot of inbound traffic/leads, and ideas for tangential use cases with commercial potential.
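The "context packages" idea above could be sketched as a small builder that bundles docs for a library into one versioned, servable JSON file. The schema here (name/version/updated/documents) and the example library are purely hypothetical; any stable format behind a public URL would do.

```python
import json
from datetime import date

def build_context_package(name, version, documents):
    """Bundle library docs into a single JSON 'context package'.

    Schema is a made-up example -- the point is a versioned,
    cacheable artifact that anyone can fetch from a public URL.
    """
    return {
        "name": name,
        "version": version,
        "updated": date.today().isoformat(),
        "documents": [
            {"title": title, "text": text} for title, text in documents
        ],
    }

package = build_context_package(
    "fastapi-docs",  # hypothetical package name
    "0.110",
    [("Routing", "Declare routes with @app.get(...)"),
     ("Dependencies", "Use Depends() to inject shared logic")],
)
serialized = json.dumps(package, indent=2)
```

Keeping the `version` and `updated` fields explicit is what lets consumers tell at a glance whether a package is fresher than their model's training cutoff.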
I love this. It can be very useful to have a ready-to-use library of context data, and at the same time it's a perfect way to bring in new users. Thanks so much.
To keep it free and avoid forwarding credentials, I built an alternative for all-in-one chatting on the web, with no auth and access to a search API:
https://llmcouncil.github.io/llmcouncil/
It provides a simple interface to chat with Gemini, Claude, Grok, OpenAI, and DeepSeek in parallel.
I use RooCode and find it quite effective, with the ability to switch agents and models within tasks. I recently moved from Cline to RooCode.
I thought about building something along these lines (not the same but vaguely similar).
Then Gemini AI Studio came along with a 1 million token window and allowed me to upload zip files of my entire code base and I lost interest in my own thing.
Is it not a bit weird to freely give away your entire code base (I assume it's personal, not your company's, but maybe I'm wrong) to an entity like Google?
Yes, the long context is complementary. In other chat services like Gemini you have to rewrite that base context every time for each fresh chat, plus they lack specific data-import tools and project management.
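The "stop rewriting the base context for every fresh chat" workflow amounts to storing the project context once and prepending it to every new conversation. A minimal sketch, with a hypothetical class name; a real service would persist this server-side:

```python
class ProjectContext:
    """Store a project's base context once and reuse it per chat.

    Hypothetical helper -- illustrates the workflow, not any
    particular product's API.
    """

    def __init__(self, base_context: str):
        self.base_context = base_context

    def new_chat(self, first_message: str) -> list[dict]:
        # Every fresh chat starts with the saved base context,
        # so the user never has to paste it in again.
        return [
            {"role": "system", "content": self.base_context},
            {"role": "user", "content": first_message},
        ]

ctx = ProjectContext("Project: billing service. Stack: Go + Postgres.")
messages = ctx.new_chat("Why would invoice totals drift after a refund?")
```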
Easiest way to prove it can't handle the full context in reality is to upload a one hour documentary movie with audio and ask it to write timestamps of chapters/critical moments. It can't handle this beyond 10 minutes even remotely reliably.
An agentic flow can solve this within an existing UI/app; I already use such a workflow when I have to bring in project documentation. That will be your competition.
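The agentic flow mentioned above is typically a map-reduce over the material: split the transcript into segments small enough that the model handles each reliably, extract candidate chapter moments per segment, then merge. A sketch under that assumption; `find_key_moment` is a stub standing in for a real model call:

```python
def chunk_transcript(lines, chunk_seconds=600):
    """Group (seconds, text) transcript lines into ~10-minute chunks --
    roughly the span the comment above says models handle reliably."""
    chunks, current, start = [], [], 0
    for ts, text in lines:
        if ts - start >= chunk_seconds and current:
            chunks.append(current)
            current, start = [], ts
        current.append((ts, text))
    if current:
        chunks.append(current)
    return chunks

def extract_chapters(lines, find_key_moment):
    """Map step runs once per chunk (where a model call would go),
    then the per-chunk results are merged into one chapter list."""
    return [find_key_moment(chunk) for chunk in chunk_transcript(lines)]

# Stub "model": take the first line of each chunk as its chapter marker.
transcript = [(0, "Intro"), (650, "Act one"), (1300, "Act two")]
chapters = extract_chapters(transcript, lambda chunk: chunk[0])
```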
Since it's a commercial product and feedback can be useful: people would generally be hesitant to leave their existing apps if there's a workaround. There's a certain stickiness to them, even ChatGPT. Personally I use self-hosted LibreChat, and the history and additional features it provides are important to me.
What I can do is to make it very transparent on how data is managed.
The file contents are appended to the context builder; the context and messages are then processed through OpenRouter, a provider that offers APIs for all the major AI models; and the generated output (along with account data) is stored in a secured database on the MongoDB platform.
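OpenRouter exposes an OpenAI-compatible chat-completions endpoint, so the flow described above roughly means folding the file contents into the messages and posting the payload there. A sketch only: the file names and the way the context is assembled are assumptions, and the actual HTTP call (which needs an API key) is left as a comment.

```python
import json

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model, file_contents, user_message):
    """Append file contents to the context, then shape an
    OpenAI-style chat payload for OpenRouter."""
    context = "\n\n".join(
        f"# {name}\n{text}" for name, text in file_contents.items()
    )
    return {
        "model": model,  # any model slug OpenRouter routes to
        "messages": [
            {"role": "system", "content": f"Project context:\n{context}"},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_request(
    "anthropic/claude-3.5-sonnet",
    {"README.md": "A billing service in Go."},
    "Summarize the project.",
)
body = json.dumps(payload)
# A real client would POST `body` to OPENROUTER_URL with an
# "Authorization: Bearer <api key>" header (omitted here).
```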
Isn't NotebookLM already exactly web and file context (a "ContextChat")?
Edit: I assume it is basically a similar product, but your differentiators are mainly the customer getting to choose their model, and you getting to write your own context adding ergonomics (like adding links from a Sitemap)?
Exactly: similar, plus tools to import and manage project context fast (like private GitHub repos and sitemap URLs), multiple AI models, and pay-per-use as with the APIs.
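The sitemap import mentioned above has a well-defined starting point, since sitemap.xml follows the standard sitemaps.org format: pull the `<loc>` URLs out, then fetch each page into the project context. A sketch of the parsing step (the fetching and cleanup that would follow are left out):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def urls_from_sitemap(xml_text: str) -> list[str]:
    """Extract page URLs from a standard sitemap.xml so each
    page can be fetched and added to the project context."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(f"{SITEMAP_NS}loc")]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/docs/intro</loc></url>
  <url><loc>https://example.com/docs/api</loc></url>
</urlset>"""

urls = urls_from_sitemap(sitemap)
```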
I think it would be better if it was just context and not connected to any model. Think of one place where you can hook in your drive folder, GitHub, etc. and have it produce the best context for the task you want to achieve. Then users can copy that to their model or workflow of choice
Thank you, this could be a cool feature to add! For example, the ability to click a link that redirects to another chat service with the project base context you built, and optionally all the messages sent up to that point.
From what I can see, it doesn't offer the flexibility of importing content from a detailed sitemap or private GitHub repositories in a fast way (and more tools are coming).
It also doesn't offer the option to switch between AI models, plus you have to pay a monthly subscription.
Cool! I was excited when I saw this and signed up.
One key thing I was hoping for was a consistent resync with source material, particularly Google Docs. Looks like I'll have to download and re-upload to your app whenever they change.
Yes, this sounds very useful and productive: having project context updated based on external events. I will share it on socials when ready, thanks for the feedback!
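One way to do the resync discussed above is to poll each source's last-modified timestamp and re-import anything that changed since the last import. Google Drive's v3 API does expose a `modifiedTime` per file (via `files.list`); everything else here, the field names on our side and the polling shape, is an assumption:

```python
from datetime import datetime

def needs_resync(source_modified: str, last_imported: str) -> bool:
    """True when the source doc changed after our last import.
    Timestamps are RFC 3339 strings, as Drive's modifiedTime returns."""
    parse = lambda s: datetime.fromisoformat(s.replace("Z", "+00:00"))
    return parse(source_modified) > parse(last_imported)

# A poller would call the Drive API (files.list with
# fields=files(id,name,modifiedTime)), then re-import stale docs:
docs = [
    {"name": "Spec", "modifiedTime": "2024-05-02T10:00:00Z"},
    {"name": "Notes", "modifiedTime": "2024-04-01T08:00:00Z"},
]
last_import = "2024-05-01T00:00:00Z"
stale = [d["name"] for d in docs if needs_resync(d["modifiedTime"], last_import)]
```

Drive also offers push notifications (watch channels) for an event-driven version, but timestamp polling is the simpler first cut.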
I tried to solve my own problem of copying and pasting the same starting context from chat to chat. Now I can generate the base context once and start new chats from there.
You and I are going to end up competing, because I'm evolving my original solution in this space: https://github.com/backnotprop/prompt-tower ... best of luck, great execution thus far.
So now our jobs are shifting from doing the work to telling the AI to do the work, so now we need management tools to better manage how we tell the AI to do the work.
Just upload a novel and ask it questions; you'll see how it botches simple stuff.
Yes, I will work on making context management more productive, with a ready-to-use service and the ability to switch over from other services easily.
It’s all defined in the privacy policy here: https://contextch.at/docs/privacy-policy/index.html
https://www.anthropic.com/news/projects
Is that right? Is auto-syncing in the plan?
I must have taken a turn to the wrong timeline.
That ability will decay of course, and you will be managing with the best of them. Eh, I mean the worst :)