duplicate key value violates unique constraint

Augusto Piva 35 Reputation points
2025-01-07T20:20:31.5433333+00:00

Hi,

I'm intermittently running into this issue when a Stream Analytics job writes its output to a PostgreSQL database every 15 minutes.

Here is the detailed version of it:
First Occurred: 1/7/2025 8:09:22 PM UTC | Resource Name: occupancy | Message: Encountered error trying to write 36 event(s): XX000: duplicate key value violates unique constraint "occupancy_pkey_102650" DETAIL: Detail redacted as it may contain sensitive data. Specify 'Include Error Detail' in the connection string to include this information.

I have my ASA job configured to use a Drop error policy, but this error still seems to slow down the service.

I found in the ASA docs that the documented way to prevent this error applies to Azure SQL Database output:

https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-troubleshoot-output#key-violation-warning-with-azure-sql-database-output

But in my case the database is PostgreSQL, and it doesn't have the IGNORE_DUP_KEY option that Azure SQL has.

Has anyone experienced this and been able to solve it?

What are your thoughts on this?

Thanks,

Augusto


1 answer

  1. AnnuKumari-MSFT 33,981 Reputation points Microsoft Employee
    2025-01-09T06:30:02.0266667+00:00

    Hi Augusto Piva,

    Thank you for using the Microsoft Q&A platform and for posting your query here.

    It sounds like you're encountering a common issue with PostgreSQL where duplicate key values violate a unique constraint on insert.

    Instead of IGNORE_DUP_KEY, use PostgreSQL's ON CONFLICT clause with DO NOTHING or DO UPDATE to handle conflicts:

    -- Skip rows whose key already exists instead of raising an error
    INSERT INTO table_name (column1, column2)
    VALUES (value1, value2)
    ON CONFLICT (unique_column) DO NOTHING;
    

    For more details, kindly check this documentation: https://www.postgresql.org/docs/current/sql-insert.html#SQL-ON-CONFLICT
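
    If you want incoming rows to overwrite the existing ones instead of being dropped, a DO UPDATE variant works as well (a minimal sketch using the same placeholder names as above):

    -- Upsert: update the non-key column when the key already exists;
    -- EXCLUDED refers to the row that was proposed for insertion
    INSERT INTO table_name (unique_column, column2)
    VALUES (value1, value2)
    ON CONFLICT (unique_column)
    DO UPDATE SET column2 = EXCLUDED.column2;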

    For the existing duplicates, identify the duplicate values causing the error and clean up the data by removing or updating those rows:

    -- Find the values that occur more than once in the column that should be unique
    SELECT unique_column, COUNT(*)
    FROM table_name
    GROUP BY unique_column
    HAVING COUNT(*) > 1;
    
    -- Keep one physical row per value and delete the rest
    -- (ctid is PostgreSQL's physical row identifier)
    DELETE FROM table_name
    WHERE ctid NOT IN (
        SELECT MIN(ctid)
        FROM table_name
        GROUP BY unique_column
    );
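
    Note that the INSERT statements are generated by the Stream Analytics output itself, so you cannot add ON CONFLICT to them directly. One possible workaround (a sketch, not an official ASA feature; it assumes a table named occupancy with a primary key column named id, so adjust to your schema) is a BEFORE INSERT trigger that silently skips rows whose key already exists:

    -- Requires PostgreSQL 11+ for EXECUTE FUNCTION
    CREATE OR REPLACE FUNCTION occupancy_skip_duplicates()
    RETURNS trigger AS $$
    BEGIN
        -- 'id' is a placeholder for your actual primary key column.
        -- Returning NULL skips the insert for this row without raising an error.
        IF EXISTS (SELECT 1 FROM occupancy WHERE id = NEW.id) THEN
            RETURN NULL;
        END IF;
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER occupancy_dedup
    BEFORE INSERT ON occupancy
    FOR EACH ROW EXECUTE FUNCTION occupancy_skip_duplicates();

    Keep in mind the EXISTS check and the insert are not atomic, so concurrent writers could still race; with a single ASA output writing every 15 minutes this is unlikely to matter.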
    

    Hope it helps. Kindly accept the answer by clicking on the 'Accept answer' button. Thank you.

