
I added comments to my Postgres table using this method, and the statements ran successfully.

COMMENT ON TABLE statements IS 'Table storing bank transaction data imported from statements sheet';
COMMENT ON COLUMN statements.id IS 'Unique identifier for each transaction';
COMMENT ON COLUMN statements.customer_id IS 'Reference to the customer, alphanumeric ID';

I am using this query to get the comments on the table:

SELECT 
    column_name,
    data_type,
    col_description(('public.' || 'statements')::regclass, ordinal_position) AS column_comment
FROM 
    information_schema.columns
WHERE 
    table_schema = 'public'
    AND table_name = 'statements';

Here is the result of the query:

What am I missing? How do I add comments properly, or how do I retrieve the info properly?

I tried this but it didn't work; the error says user_col_comments doesn't exist.

Here is my client code

from sqlalchemy import create_engine, text
connection_string = f"postgresql://{postgres_username}:{postgres_password}@{postgres_host}:{postgres_port}/{postgres_database}"
engine = create_engine(connection_string)

def execute_sql_from_file(engine, file_path):
    """
    Execute SQL queries from a file.
    
    Args:
        engine: SQLAlchemy engine object
        file_path: Path to the SQL file
    
    Returns:
        List of results from executed queries
    """
    # Read the SQL file
    with open(file_path, "r") as file:
        sql_queries = file.read()

    # Split the queries if there are multiple
    queries = sql_queries.split(";")

    # Store results
    results = []

    # Execute each query individually
    with engine.connect() as connection:
        for query in queries:
            query = query.strip()
            if query:  # Ensure the query is not empty
                print(f"Executing query: {query}")
                result = connection.execute(text(query))

                # If the query returns data (SELECT), store it in results
                if result.returns_rows:
                    results.append(result.fetchall())
                print("Query executed successfully")

    return results
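
For reference, the helper gets called along these lines (the SQL file name is just a placeholder):

results = execute_sql_from_file(engine, "add_comments.sql")
print(results)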
  • Did you COMMIT the table/column COMMENT statements? Commented Mar 28 at 22:34
  • No, I didn't. How can I do that?
    – Kundan
    Commented Mar 28 at 22:39
  • Doesn't Postgres automatically commit a COMMENT?
    – Kundan
    Commented Mar 28 at 22:41
  • 1) autocommit depends on the client. What client are you using? 2) The query works for me here. Commented Mar 28 at 22:42
  • Read the SQLAlchemy engine.connect docs, in particular the section Using Transactions. Without the transaction coding you get: "When the connection is returned to the pool for re-use, the pooling mechanism issues a rollback() call on the DBAPI connection so that any transactional state or locks are removed, and the connection is ready for its next use."
    – Adrian Klaver
    Commented Mar 28 at 23:05

1 Answer


Add connection.commit() after the for loop if you're sticking to the commit as you go style (autobegin behaviour):

    with engine.connect() as connection:
        for query in queries:
            query = query.strip()
            if query:  # Ensure the query is not empty
                print(f"Executing query: {query}")
                result = connection.execute(text(query))
                # If the query returns data (SELECT), store it in results
                if result.returns_rows:
                    results.append(result.fetchall())
                print("Query executed successfully")
        connection.commit()############################# here

    return results

Or wrap it all in a with connection.begin(): context manager to switch to the begin once style:

    with engine.connect() as connection:
        with connection.begin():######################### here
            for query in queries:
                query = query.strip()
                if query:  # Ensure the query is not empty
                    print(f"Executing query: {query}")
                    result = connection.execute(text(query))
                    # If the query returns data (SELECT), store it in results
                    if result.returns_rows:
                        results.append(result.fetchall())
                    print("Query executed successfully")

    return results
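
As an aside (not required for the fix), SQLAlchemy's engine.begin() combines the connect and begin once steps into a single context manager that commits on success and rolls back on error; a minimal sketch of that variant:

    with engine.begin() as connection:  # transaction starts here and commits when the block exits cleanly
        for query in queries:
            query = query.strip()
            if query:  # Ensure the query is not empty
                result = connection.execute(text(query))
                if result.returns_rows:
                    results.append(result.fetchall())

    return results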

Without these, your with engine.connect() as connection: block just closes the connection without persisting the changes you made in that context, so they get discarded. Quoting the SQLAlchemy doc already recommended by @Adrian Klaver:

the connection object does not assume changes to the database will be automatically committed, instead requiring in the default case that the connection.commit() method is called in order to commit changes to the database.

It's typically preferable to be able to roll back all steps of your query set if any of them fails. If you want the results of all successfully executed queries to be saved, up to the one that failed, you can connection.commit() after each individual query, at the end of each iteration of the loop - or push the with connection.begin(): down inside the loop.
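
A rough sketch of that commit-after-each-query variant (same queries and results variables as above):

    with engine.connect() as connection:
        for query in queries:
            query = query.strip()
            if query:  # Ensure the query is not empty
                result = connection.execute(text(query))
                if result.returns_rows:
                    results.append(result.fetchall())
                connection.commit()  # persist this statement now, so earlier work survives a later failure

    return results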


Here's an example showing concurrent sessions don't see the comments until you commit, reproducing your initial problem.
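
A rough sketch of that idea with two connections to the same engine (engine, table name and credentials as in the question; comment text made up):

    from sqlalchemy import text

    with engine.connect() as writer, engine.connect() as reader:
        writer.execute(text("COMMENT ON TABLE statements IS 'only visible after commit'"))

        check = text("SELECT obj_description('public.statements'::regclass, 'pg_class')")
        print(reader.execute(check).scalar())   # None: the second session cannot see the uncommitted comment

        writer.commit()
        print(reader.execute(check).scalar())   # 'only visible after commit'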

  • Hi, I tried both approaches but I am still not able to see the comments.
    – Kundan
    Commented Mar 29 at 21:11
  • @Kundan In that case it could mean you're not creating and commenting the table in the public schema but instead in the $user schema (named after the postgres_username you're connecting as to run those queries), or that you're running those statements on one database and then connecting to another to check it, or to one that's named the same but sits on a different cluster/server altogether. You can have multiple Postgres clusters and versions on one machine, on different ports.
    – Zegarek
    Commented Mar 29 at 21:24
  • I am using this particular connection string in both cases: postgresql://postgres:kundan@localhost:5432/sympera. It should connect to the same schema, right?
    – Kundan
    Commented Mar 30 at 1:39
  • You can check with show search_path;. If the client writing the comments is a Python app, what do you use to check them (psql, DBeaver, pgAdmin, DataGrip)? Is this db behind a tunnel routed out from localhost:5432, or is it really local? How many databases do you have on the same host, and how are they set up (regular install, VM, docker containers)?
    – Zegarek
    Commented Mar 30 at 8:01
