
Database Administrators

Q&A for database professionals who wish to improve their database skills

Latest Questions

0 votes
1 answer
163 views
Seeking Guidance on Migrating MySQL/MariaDB Data to Supabase
I find myself facing a challenge, and I'm reaching out for some guidance. My current project involves a rebuild for a client, and we're considering adopting the Supabase stack. However, our existing MySQL database houses a substantial amount of data, and I'm a bit uncertain about the migration process. I've already exported the data to a .sql file, which includes table creation, relations, and the data itself. While I've come across the Supabase documentation on migrating from MySQL, I admit there are aspects I'm struggling to grasp. Has anyone here successfully migrated from MySQL/MariaDB to Supabase before? I would greatly appreciate any insights, tips, or resources that could help me navigate this process more smoothly. If you've encountered similar challenges or have valuable experience with Supabase migrations, your advice would be gold to me right now. Thanks a bunch in advance!

**Update Nov 14, 2023:** I switched from Windows to my Mac so I could use pgloader, installing it with `brew install pgloader` (which pulls in its dependencies automatically). I then created a folder with a `config.load` file containing:

```
load database
  from mysql://user:@host/source_db
  into postgres://postgres:XXX@db.xxxx.supabase.co:XXXX/postgres
alter schema 'public' owner to 'postgres';
set wal_buffers = '64MB', max_wal_senders = 0, statement_timeout = 0, work_mem to '2GB';
```

The MySQL password is empty because the database comes from XAMPP's MySQL, where it defaults to null; the Postgres credentials come from my database settings on Supabase. Note that this targets the hosted Supabase instance, not a local setup. Running this, I'm encountering an error at line 4 of the config file (see the screenshot).

**Note:** I moved this post here from StackOverflow since it's database-related, but I can't tag it with Supabase because the 300-reputation requirement prevents me from adding tags. Please don't hate me for using the Postgres tag; Supabase is built on Postgres, and it's the only tag I can use. I've also added the MySQL tag to increase visibility.
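For reference, here is what a pgloader command file looks like when following the documented clause order — a minimal sketch with placeholder credentials, not the poster's exact setup (the `with` options shown are typical for a fresh import, and `source_db` stands in for the MySQL schema name):

```
load database
    from mysql://root@127.0.0.1/source_db
    into postgres://postgres:YOUR_PASSWORD@db.YOUR_PROJECT.supabase.co:5432/postgres

with include drop, create tables, create indexes, reset sequences

alter schema 'source_db' rename to 'public';
```

pgloader parses the connection URIs before anything else, so if a parse error like the one above persists, checking the URI syntax (for example, how the empty MySQL password is written) is a reasonable first step.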
Mathew Agustin Bella (101 rep)
Nov 14, 2023, 01:48 PM • Last activity: Jul 7, 2025, 08:02 PM
0 votes
0 answers
196 views
Is it possible to create a table in Supabase programmatically using the Python client?
I would like to create a table on Supabase programmatically. Is this possible? Using this code:
```python
from supabase_client import supabase

# SQL statements to create tables
sql_create_tables = """--sql
CREATE TABLE IF NOT EXISTS biomarker (
    name VARCHAR(255) PRIMARY KEY,
    unit VARCHAR(100) NOT NULL,
    reference_range_min VARCHAR(100),
    reference_range_max VARCHAR(100),
    description TEXT
);
"""

# Execute raw SQL to create the tables
response = supabase.rpc("execute_sql", {"query": sql_create_tables}).execute()
print(response)
```
However, I am getting this error when running this code:
```
postgrest.exceptions.APIError: {'code': 'PGRST202', 'details': 'Searched for the function public.execute_sql with parameter query or with a single unnamed json/jsonb parameter, but no matches were found in the schema cache.', 'hint': None, 'message': 'Could not find the function public.execute_sql(query) in the schema cache'}
```
Is it just not possible to create a table programmatically on Supabase?
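For context, PGRST202 means PostgREST looked for a function named `public.execute_sql` to expose over RPC and found none; the Python client can only invoke functions that already exist in the database. A common workaround is to create such a helper once in the SQL editor. Here's a minimal sketch, assuming you accept that it lets callers run arbitrary SQL (lock it down, e.g. to the service role, before using it anywhere real):

```sql
-- Hypothetical helper to pair with supabase.rpc("execute_sql", {"query": ...}).
-- The parameter must be named "query" to match the RPC payload above.
-- WARNING: this executes arbitrary SQL with the function owner's privileges.
create or replace function public.execute_sql(query text)
returns void
language plpgsql
security definer
as $$
begin
  execute query;
end;
$$;

-- PostgREST may need its schema cache refreshed before the RPC resolves:
notify pgrst, 'reload schema';
```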
Paul Razvan Berg (101 rep)
Dec 27, 2024, 09:05 AM
0 votes
0 answers
32 views
Preventing Race Conditions with SERIALIZABLE Isolation in Supabase for High-Concurrency Updates
**Question:** I'm working with Supabase/PostgreSQL in a Next.js application and need to ensure that an is_processing flag is updated only once per user, even if multiple requests are sent in parallel. The goal is to prevent any duplicate operations by ensuring that once is_processing = true, additional requests won't proceed until it's reset.

**Approach So Far:** I initially created a PostgreSQL function using SERIALIZABLE isolation to handle concurrency, with the function attempting to set is_processing to true if it's initially false. Here's the function:
```sql
CREATE OR REPLACE FUNCTION serialized_update_processing_flag(user_id UUID) RETURNS BOOLEAN AS $$
DECLARE
    result BOOLEAN;
BEGIN
    SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;

    UPDATE users
    SET is_processing = true
    WHERE id = user_id
      AND is_processing = false
    RETURNING true INTO result;

    RETURN result IS NOT NULL;
EXCEPTION
    WHEN serialization_failure THEN
        RETURN FALSE;
END;
$$ LANGUAGE plpgsql;
```
**Problem:** Running this function directly throws an error: "SET TRANSACTION ISOLATION LEVEL must be called before any query."

**Alternative Implementation in Next.js:** I moved the transaction management into my Next.js code, starting with SERIALIZABLE isolation at the application level. Here's the revised approach:
```javascript
const client = await pool.connect();

try {
  // Start a transaction with SERIALIZABLE isolation level
  await client.query('BEGIN');
  await client.query('SET TRANSACTION ISOLATION LEVEL SERIALIZABLE');

  // Call the RPC function
  const { rows } = await client.query(
    'SELECT serialized_update_processing_flag($1) AS success',
    [userId]
  );

  await client.query('COMMIT');

  // Check the result (rows is an array; the value is on the first row)
  console.log('ROWWW', rows);
  return rows[0].success;
} catch (error) {
  await client.query('ROLLBACK');
  console.error(`Error in serialized transaction: ${error}`);
  throw error;
} finally {
  client.release();
}
```
**Issue with High-Concurrency Requests:** This approach has been somewhat effective, but when I simulate around 30 parallel requests, some of them still manage to bypass the SERIALIZABLE isolation level, resulting in multiple operations setting is_processing = true concurrently.

**Questions:**

1. How can I reliably enforce strict SERIALIZABLE isolation in this setup to prevent any concurrent updates from slipping through?
2. Are there alternative methods in PostgreSQL or Supabase for handling high-concurrency, single-update situations with atomicity?

I'd appreciate any suggestions or solutions to ensure consistent behavior and prevent duplicate operations, especially under high parallel loads. Thank you!
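One point of comparison worth noting: in PostgreSQL, a single `UPDATE ... WHERE is_processing = false` is already atomic at the default READ COMMITTED level, because concurrent writers queue on the row lock and re-evaluate the WHERE clause after the first writer commits, so only one caller can ever observe `is_processing = false`. A minimal sketch of that approach, assuming the `users` table from the question (`try_acquire_processing_flag` is a hypothetical name, not from the original post):

```sql
-- Lock-free variant: relies on row-level locking plus the re-checked
-- WHERE clause, so no explicit isolation-level change is needed.
CREATE OR REPLACE FUNCTION try_acquire_processing_flag(p_user_id UUID)
RETURNS BOOLEAN AS $$
DECLARE
    updated_rows INT;
BEGIN
    UPDATE users
    SET is_processing = true
    WHERE id = p_user_id
      AND is_processing = false;   -- only one concurrent caller sees false here

    GET DIAGNOSTICS updated_rows = ROW_COUNT;
    RETURN updated_rows = 1;       -- true only for the caller that flipped the flag
END;
$$ LANGUAGE plpgsql;
```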
Obaid Aqeel (101 rep)
Nov 6, 2024, 02:59 PM
0 votes
1 answer
43 views
Postgres trigger with after and on row appears to not run after writing to row
I have the following two tables:
```sql
CREATE TABLE qanda.group_questions (
	id uuid DEFAULT gen_random_uuid() NOT NULL,
	category_id uuid NULL,
	business_model_id int4 NULL,
	industry_id int4 NULL,
	business_structure_id int4 NULL,
	CONSTRAINT group_questions_pkey PRIMARY KEY (id),
	CONSTRAINT group_questions_business_model_id_fkey FOREIGN KEY (business_model_id) REFERENCES public.business_models(id) ON DELETE CASCADE,
	CONSTRAINT group_questions_business_structure_id_fkey FOREIGN KEY (business_structure_id) REFERENCES public.business_structure(id),
	CONSTRAINT group_questions_category_id_fkey FOREIGN KEY (category_id) REFERENCES public.category(id) ON DELETE CASCADE,
	CONSTRAINT group_questions_industry_id_fkey FOREIGN KEY (industry_id) REFERENCES public.industry(id) ON DELETE CASCADE
);

CREATE TABLE qanda.plain_text_questions (
	id uuid NOT NULL,
	question text NOT NULL,
	user_prompt text NULL,
	CONSTRAINT plain_text_questions_pkey PRIMARY KEY (id),
	CONSTRAINT plain_text_questions_id_fkey FOREIGN KEY (id) REFERENCES qanda.group_questions(id)
);
```
When a row is inserted into the group_questions table, the following trigger fires:

```sql
CREATE TRIGGER on_group_question_inserted
AFTER INSERT ON qanda.group_questions
FOR EACH ROW EXECUTE FUNCTION notify_web_service_on_insert();
```
The trigger calls a function that calls a web service, which should then insert data into the plain_text_questions table. However, it fails with this error:

```
"Error inserting data:",
{
  "code": "23503",
  "details": "Key (id)=(50ec4620-a27b-429b-b05f-4fe786f116a9) is not present in table \"group_questions\".",
  "hint": null,
  "message": "insert or update on table \"plain_text_questions\" violates foreign key constraint \"plain_text_questions_id_fkey\""
}
```
The trigger function is:
```sql
CREATE OR REPLACE FUNCTION public.notify_web_service_on_insert()
 RETURNS trigger
 LANGUAGE plpgsql
AS $function$
DECLARE
    details JSON;
    payload JSON;
    request extensions.http_request;
BEGIN
    -- Fetch details using the get_details_by_group_question_id function
    -- (this selects from qanda.group_questions, demonstrating that the data is in the table)
    SELECT qanda.get_details_by_group_question_id(NEW.id) INTO details;

    -- Construct the JSON payload by adding the group_question_id
    payload := jsonb_set(details::jsonb, '{group_question_id}', to_jsonb(NEW.id::text));

    -- Construct the HTTP request structure
    request := (
        'POST',                                       -- Method
        'https://this_is_the_webservice/ ',           -- URI
        ARRAY[extensions.http_header('Authorization', 'Bearer hello_world'),  -- Headers
              extensions.http_header('Content-Type', 'application/json')],
        'application/json',                           -- Content Type
        payload::text                                 -- Content
    )::extensions.http_request;

    -- Make the HTTP POST request
    PERFORM content FROM extensions.http(request);

    -- Return the NEW record to allow the insert operation to complete
    RETURN NEW;
END;
$function$;
```
If I change the trigger to fire BEFORE instead of AFTER, it fails, as it should: based on my understanding, BEFORE means the row has not yet been written to the table, while AFTER should mean it has been. And the successful use of SELECT qanda.get_details_by_group_question_id(NEW.id) INTO details; shows the row is indeed there. So I'm left confused.
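One detail that may explain the confusion: an AFTER ROW trigger still runs inside the inserting transaction, before COMMIT. The row is visible to queries in that same transaction (which is why the SELECT INTO details works), but the web service connects on its own session and cannot see the row until the transaction commits, hence the foreign-key failure when it writes back. A common way around this on Supabase is to queue the HTTP call asynchronously; here's a sketch assuming the `pg_net` extension is enabled (check `net.http_post`'s exact signature on your instance), since its background worker only sees the queued request after the enqueuing transaction has committed:

```sql
-- Sketch: queue the webhook via pg_net instead of calling it synchronously.
-- The queued request only becomes visible to pg_net's background worker
-- once this transaction commits, so the new row is visible to the service.
CREATE OR REPLACE FUNCTION public.notify_web_service_on_insert()
RETURNS trigger
LANGUAGE plpgsql
AS $function$
BEGIN
    PERFORM net.http_post(
        url     := 'https://this_is_the_webservice/',
        headers := jsonb_build_object(
                       'Authorization', 'Bearer hello_world',
                       'Content-Type', 'application/json'),
        body    := jsonb_build_object('group_question_id', NEW.id)
    );
    RETURN NEW;
END;
$function$;
```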
Magick (111 rep)
Jul 16, 2024, 08:54 PM • Last activity: Jul 17, 2024, 03:25 AM
0 votes
1 answer
311 views
Debug RPC Function Creation in Supabase
I'm pretty new to database stuff. I'm using Supabase to create an application where I keep track of the number of likes on certain items ('clicks'). I want to filter items either by the date the likes were added or by certain categories the items have. So far I have a function that I can call from JavaScript like:

```javascript
const { data, error } = await supabase.rpc('rpc_test', {
  "json_params": {
    "categories": '{"fruits", "test"}',
    "start_date": "2024-04-16 00:22:35.837547+00",
  }
})
```

This should return all items that have a category matching the array I pass in, along with the number of clicks created since start_date (and before end_date if provided), or zero if no clicks were created in that time window. It so nearly works, but I keep running into errors that I don't know how to fix.

The important tables in my database are:

Items:

| id | name |
| -------- | -------------- |
| 1 | apple |
| 2 | beet |

Clicks:

| item_id | created_at |
| -------- | -------------- |
| 1 | 2024-04-09 |
| 2 | 2024-04-09 |

Categories:

| id | name |
| -------- | -------------- |
| 1 | vegetable |
| 2 | fruit |

Item Categories:

| item_id | category_id |
| -------- | -------------- |
| 1 | 2 |
| 2 | 1 |

My function query currently looks like this:
```sql
create or replace function public.rpc_test (json_params json default null) returns table (
  id bigint,
  created_at timestamp with time zone,
  name text,
  clicks bigint,
  categories_arr text[]
) as $$
BEGIN
  RETURN QUERY
    select
      items.id,
      items.created_at,
      items.name,
      click_counts.clicks,
      item_id_to_cat_array.categories_arr
    from
      items
      LEFT JOIN (
          SELECT item_categories.item_id AS itemid, array_agg(categories.name) AS categories_arr
          FROM   item_categories
          JOIN   categories ON categories.id = item_categories.category_id
          GROUP  BY item_categories.item_id
      ) item_id_to_cat_array ON items.id = item_id_to_cat_array.itemid
      LEFT JOIN (
          SELECT item_id as click_item_id, count(c.id) AS clicks
          FROM clicks as c
          WHERE (json_params->>'start_date' IS NULL OR c.created_at >= (json_params->>'start_date')::date)
          AND (json_params->>'end_date' IS NULL OR c.created_at < (json_params->>'end_date')::date)
          GROUP BY c.item_id
      ) click_counts ON click_item_id = items.id
    where
        json_params->>'categories' IS NULL OR
        (json_params->>'categories')::text[] && item_id_to_cat_array.categories_arr;
END;
$$ language plpgsql;
```
The only problem with this is that categories_arr never has any data. At various points I've had iterations of this that gathered the information I wanted, but without the filtering in place. I've tried doing things with GROUP BY and HAVING instead, and I'm not really sure where to go. How can I get more information about what is happening in my query? I would like to see what categories_arr is at every step of the process, but I don't know how to log that information.
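On the logging part of the question: inside plpgsql, `RAISE NOTICE` is the usual way to inspect intermediate values, and the notices show up in the Postgres logs (visible in Supabase's log explorer). As a self-contained sketch of how the categories parameter parses (the sample value is illustrative):

```sql
-- Runnable as-is in the SQL editor; prints two NOTICE lines.
DO $$
DECLARE
    json_params json := '{"categories": "{\"fruits\", \"test\"}"}';
BEGIN
    RAISE NOTICE 'categories param: %', json_params->>'categories';
    RAISE NOTICE 'parsed array: %', (json_params->>'categories')::text[];
END;
$$;
```

The same `RAISE NOTICE` calls can be dropped into the function body (before `RETURN QUERY`) to watch what each parameter looks like on a real call.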
Rob (3 rep)
May 9, 2024, 03:35 AM • Last activity: Jun 5, 2024, 08:29 PM
2 votes
2 answers
1180 views
Merging information from multiple rows into one row
I have three tables: event, person, and event_person_map.

**event:**

```
id integer PRIMARY KEY
event_name text
event_date date
```

**person:**

```
id integer PRIMARY KEY
full_name text
email_address text
```

**event_person_map:**

```
id integer PRIMARY KEY
person_id integer [referencing person.id]
event_id integer [referencing event.id]
```

How can I structure my query to get an output which lists information about the event and the list of participants? An example of the output would be something like this:

| event_id | event_name | event_date | participants |
| -------- | ---------- | ---------- | ------------ |
| 1 | event1 | 1.1.2024 | json list |
| 2 | event2 | 1.2.2024 | json list |
| 3 | event3 | 1.3.2024 | json list |
| 4 | event4 | 1.4.2024 | json list |

The structure of the json list of participants should look something like this:

```json
[
  {
    "id": 1,
    "full_name": "John Doe",
    "email_address": "john@doe.com"
  },
  {
    "id": 2,
    "full_name": "Jane Doe",
    "email_address": "jane@doe.com"
  }
]
```

I understand how to get the list of participants separately, but I am not sure what to do to make it part of the query with the event info. The output does not have to be strictly like this; if there are different solutions that are more elegant, I am all ears.
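A common shape for this in Postgres is `json_agg` over `json_build_object`, grouped per event. A minimal sketch against the three tables as declared (the `COALESCE ... FILTER` pair keeps events with no participants as an empty `[]` instead of `[null]`):

```sql
SELECT e.id AS event_id,
       e.event_name,
       e.event_date,
       COALESCE(
           json_agg(json_build_object(
               'id',            p.id,
               'full_name',     p.full_name,
               'email_address', p.email_address
           )) FILTER (WHERE p.id IS NOT NULL),
           '[]'
       ) AS participants
FROM event e
LEFT JOIN event_person_map m ON m.event_id = e.id
LEFT JOIN person p ON p.id = m.person_id
GROUP BY e.id, e.event_name, e.event_date
ORDER BY e.id;
```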
Lugoom485 (45 rep)
Feb 25, 2024, 03:54 PM • Last activity: Feb 26, 2024, 04:41 AM