chaz6 8 hours ago [-]
When I got the update I looked through the settings and there appears to be no way to disable it. I do not want AI anywhere near my database. At least I only use it for testing/staging, so hopefully I won't have to worry about it wrecking production.
ziml77 8 hours ago [-]
What's the danger? It can see the schemas to help it generate the queries but it can't run anything on its own. Also you have to give the application credentials to an AI provider for the feature to work. So, you can just not do that.
adamas 8 hours ago [-]
You don't need a list of potential dangers to not want non-deterministic features in an application.
justinclift 51 minutes ago [-]
> What's the danger?
Hallucinated ideas about what needs doing, what commands to run, etc.
So, data that's no longer reliable (ie could be subtly changed), or even outright data loss.
Yeah, no thanks. I switched to DBeaver already anyway, because pgAdmin was picky about which Postgres versions it could connect to, and it was too much of a hassle to set up a new version from source back when I tried. With DBeaver I just run ./dbeaver from the extracted .tar.gz. DBeaver is also not a web interface, but a real desktop application (Java, though).
david_iqlabs 45 minutes ago [-]
The interesting challenge with AI assistants inside technical tools is grounding the responses in real system signals. If the model is just interpreting natural language prompts, it tends to produce generic advice. But if the assistant is tied directly to system telemetry or query results, it becomes much more useful.
In experiments I’ve been running, the pattern that seems to work best is deterministic signals first, then a constrained AI layer that interprets those signals rather than inventing analysis.
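The "deterministic signals first, constrained AI second" pattern described above might look roughly like this sketch (all thresholds, field names, and findings here are hypothetical illustrations, not any particular tool's implementation):

```python
# Sketch: compute deterministic findings from telemetry first,
# then constrain the model to narrating those findings only.
# All thresholds and field names are hypothetical.

def score_signals(stats: dict) -> dict:
    """Turn raw telemetry into deterministic findings before any AI sees it."""
    findings = []
    if stats.get("seq_scans", 0) > 1000 and stats.get("rows", 0) > 100_000:
        findings.append("frequent sequential scans on a large table")
    if stats.get("cache_hit_ratio", 1.0) < 0.90:
        findings.append("cache hit ratio below 90%")
    return {"table": stats.get("table"), "findings": findings}

def build_prompt(scored: dict) -> str:
    """The model is only asked to explain findings, not to invent them."""
    return (
        "Explain these pre-computed findings to a DBA. "
        "Do not add findings of your own.\n"
        f"Table: {scored['table']}\n"
        f"Findings: {scored['findings']}"
    )
```

The point of the split is that the scoring step is testable and repeatable on its own; the model never decides *what* is wrong, only how to phrase it.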
david_iqlabs 46 minutes ago [-]
One thing I’ve learned building AI-assisted tooling is that the usefulness depends heavily on how constrained the AI layer is. If the AI is generating suggestions without grounding in deterministic signals from the system, it tends to produce very confident but generic output.
What ended up working better in my own experiments was: system signals, structured scoring, then an AI narrative on top of the signals.
Fuzzwah 2 hours ago [-]
While everyone else is posting top-level comments about which tools they're using rather than pgAdmin: I've been a huge fan of Beekeeper Studio since trying out a range of PostgreSQL db apps such as DBeaver, Postico, etc. a few years ago.
Click on the "Reset layout" button in the query tool (located in the top right corner), and it will move the "AI Assistant" tab to the right. Now, when you query a table, it will default to the Query tab as always.
jplaz 5 hours ago [-]
Switched from DBeaver to DataGrip and I couldn't be happier.
aitchnyu 8 hours ago [-]
Might as well let us choose our own AI subscription for our tools. I always hated the sparkle icons in MongoDB Compass (db browsing tool), CloudWatch (logs), etc., which are wired to a useless model. So I always chose to write Python scripts to query Postgres and other DBs and render pretty tables to the CLI.
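A minimal version of that script-plus-pretty-tables workflow might look like the sketch below. The DSN and query are placeholders, and it assumes psycopg2 is installed (the import is kept inside the main block so the table renderer works standalone):

```python
def render_table(headers, rows):
    """Render rows as a plain-text table with columns padded to fit."""
    cells = [list(headers)] + [[str(c) for c in row] for row in rows]
    widths = [max(len(row[i]) for row in cells) for i in range(len(headers))]
    lines = ["  ".join(c.ljust(w) for c, w in zip(row, widths)) for row in cells]
    lines.insert(1, "  ".join("-" * w for w in widths))  # header separator
    return "\n".join(lines)

if __name__ == "__main__":
    import psycopg2  # assumed installed: pip install psycopg2-binary

    # Placeholder connection string and query; substitute your own.
    with psycopg2.connect("dbname=mydb user=me") as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT relname, n_live_tup FROM pg_stat_user_tables")
            headers = [d[0] for d in cur.description]
            print(render_table(headers, cur.fetchall()))
```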
zbentley 7 hours ago [-]
Eh, as someone generally on the skeptical end of the spectrum for a lot of AI-assisted ops tasks, exploratory query generation is a great use case for it.
I’m highly proficient in code, only average at SQL, and am routinely tasked to answer one-off questions or prototype reporting queries against highly complex schemas of thousands of tables (owned by multiple teams and changing all the time, with wildly insufficient shared DAO libraries or code APIs for constructing novel queries). My skill breakdown and situation aren’t optimal, certainly, but they aren’t uncommon either.
In that context, being able to ask "write a query that returns the last ten addresses of each of the highest-spending customers, but only if those addresses are in the shipment system and are residences, not businesses" is a huge time saver. Like, I could figure out the schemas of the ten tables involved in those queries and write those joins by hand, slowly. That would take time and, given how often the schemas change, the approach might go stale fast.
stuaxo 7 hours ago [-]
If I can use this with a local LLM it could be useful.
kay_o 5 hours ago [-]
Ollama is included by default; just add the endpoint URL yourself.
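For anyone pointing a tool at a local Ollama instance: Ollama serves an OpenAI-compatible endpoint at /v1/chat/completions on port 11434 by default. A request can be built like this sketch (the model name is an example; it assumes you've run `ollama pull` for it and have `ollama serve` running):

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (default local port).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_request(prompt: str, model: str = "llama3.2") -> dict:
    """Build an OpenAI-style chat payload that Ollama's /v1 endpoint accepts."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

if __name__ == "__main__":
    # Requires a running local Ollama with the model pulled.
    body = json.dumps(build_request("Write a SQL query counting rows per table.")).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    print(urllib.request.urlopen(req).read().decode())
```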
zbentley 7 hours ago [-]
Yeah. This seems like an area where a “tiny” (2-4GB) local model would be more than sufficient to generate very high quality queries and schema answers to the vast majority of questions. To the point that it feels outright wasteful to pay a frontier model for it.
msavara 6 hours ago [-]
No thank you. One of the worst ads for python that exists. The only one worse than pgAdmin is Windows 11.
allthetime 4 hours ago [-]
postico is really nice on macos
naranha 8 hours ago [-]
The only interface that works efficiently for me with LLMs is the chatbot interface. I'd rather copy and paste snippets into the chat box than have IDEs and other tools guess what I might want to ask AI.
The first thing I do with these integrations is look for how I can remove them.
If it's just calling an API anyway, then I don't want this in my db admin tool. It also expands the surface area for potential data leakage.
"This feature requires an AI provider to be configured in Preferences > AI."
And then you have to supply an API key (see here https://www.pgedge.com/blog/ai-features-in-pgadmin-configura... )
You don't get AI for free!
https://www.beekeeperstudio.io