Generating FastAPI Schemas from the Slack Clone¶
This example is a simple demonstration of how to generate FastAPI schemas from the Slack clone database tutorial provided by Supabase.
Prerequisites¶
You will need to have the following installed:
- Python 3.10 or higher and whatever environment and package manager you prefer (e.g. `pipenv`, `poetry`, `conda`, `venv`, etc.)
- The Supabase CLI
- This package (`supabase-pydantic`)
Setup¶
Start a new Supabase instance locally using the Supabase CLI (see the Supabase CLI documentation for more information):
mkdir slack-clone
cd slack-clone
supabase init
# ... the cli will run through a setup process.
# ... after it finishes successfully, run the following command to start the project
supabase start
Navigate to the local Studio URL (usually http://127.0.0.1:54323) and go to the SQL Editor. There, click the "Quickstarts" option and choose the "Slack Clone" quickstart. This opens an editor pre-filled with the SQL that creates the necessary tables and sample data for the Slack clone. For completeness, the SQL code used in this example is provided here:
Slack Clone SQL Code
-- For use with https://github.com/supabase/supabase/tree/master/examples/slack-clone/nextjs-slack-clone
-- Custom types
create type public.app_permission as enum ('channels.delete', 'messages.delete');
create type public.app_role as enum ('admin', 'moderator');
create type public.user_status as enum ('ONLINE', 'OFFLINE');
-- USERS
create table public.users (
id uuid not null primary key, -- UUID from auth.users
username text,
status user_status default 'OFFLINE'::public.user_status
);
comment on table public.users is 'Profile data for each user.';
comment on column public.users.id is 'References the internal Supabase Auth user.';
-- CHANNELS
create table public.channels (
id bigint generated by default as identity primary key,
inserted_at timestamp with time zone default timezone('utc'::text, now()) not null,
slug text not null unique,
created_by uuid references public.users not null
);
comment on table public.channels is 'Topics and groups.';
-- MESSAGES
create table public.messages (
id bigint generated by default as identity primary key,
inserted_at timestamp with time zone default timezone('utc'::text, now()) not null,
message text,
user_id uuid references public.users not null,
channel_id bigint references public.channels on delete cascade not null
);
comment on table public.messages is 'Individual messages sent by each user.';
-- USER ROLES
create table public.user_roles (
id bigint generated by default as identity primary key,
user_id uuid references public.users on delete cascade not null,
role app_role not null,
unique (user_id, role)
);
comment on table public.user_roles is 'Application roles for each user.';
-- ROLE PERMISSIONS
create table public.role_permissions (
id bigint generated by default as identity primary key,
role app_role not null,
permission app_permission not null,
unique (role, permission)
);
comment on table public.role_permissions is 'Application permissions for each role.';
-- authorize with role-based access control (RBAC)
create function public.authorize(
requested_permission app_permission,
user_id uuid
)
returns boolean as
$$
declare
bind_permissions int;
begin
select
count(*)
from public.role_permissions
inner join public.user_roles on role_permissions.role = user_roles.role
where
role_permissions.permission = authorize.requested_permission and
user_roles.user_id = authorize.user_id
into bind_permissions;
return bind_permissions > 0;
end;
$$
language plpgsql security definer;
-- Secure the tables
alter table public.users
enable row level security;
alter table public.channels
enable row level security;
alter table public.messages
enable row level security;
alter table public.user_roles
enable row level security;
alter table public.role_permissions
enable row level security;
create policy "Allow logged-in read access" on public.users
for select using (auth.role() = 'authenticated');
create policy "Allow individual insert access" on public.users
for insert with check ((select auth.uid()) = id);
create policy "Allow individual update access" on public.users
for update using ( (select auth.uid()) = id );
create policy "Allow logged-in read access" on public.channels
for select using (auth.role() = 'authenticated');
create policy "Allow individual insert access" on public.channels
for insert with check ((select auth.uid()) = created_by);
create policy "Allow individual delete access" on public.channels
for delete using ((select auth.uid()) = created_by);
create policy "Allow authorized delete access" on public.channels
for delete using (authorize('channels.delete', auth.uid()));
create policy "Allow logged-in read access" on public.messages
for select using (auth.role() = 'authenticated');
create policy "Allow individual insert access" on public.messages
for insert with check ((select auth.uid()) = user_id);
create policy "Allow individual update access" on public.messages
for update using ((select auth.uid()) = user_id);
create policy "Allow individual delete access" on public.messages
for delete using ((select auth.uid()) = user_id);
create policy "Allow authorized delete access" on public.messages
for delete using (authorize('messages.delete', auth.uid()));
create policy "Allow individual read access" on public.user_roles
for select using ((select auth.uid()) = user_id);
-- Send "previous data" on change
alter table public.users
replica identity full;
alter table public.channels
replica identity full;
alter table public.messages
replica identity full;
-- inserts a row into public.users and assigns roles
create function public.handle_new_user()
returns trigger as
$$
declare is_admin boolean;
begin
insert into public.users (id, username)
values (new.id, new.email);
select count(*) = 1 from auth.users into is_admin;
if position('+supaadmin@' in new.email) > 0 then
insert into public.user_roles (user_id, role) values (new.id, 'admin');
elsif position('+supamod@' in new.email) > 0 then
insert into public.user_roles (user_id, role) values (new.id, 'moderator');
end if;
return new;
end;
$$ language plpgsql security definer;
-- trigger the function every time a user is created
create trigger on_auth_user_created
after insert on auth.users
for each row execute procedure public.handle_new_user();
/**
* REALTIME SUBSCRIPTIONS
* Only allow realtime listening on public tables.
*/
begin;
-- remove the realtime publication
drop publication if exists supabase_realtime;
-- re-create the publication but don't enable it for any tables
create publication supabase_realtime;
commit;
-- add tables to the publication
alter publication supabase_realtime add table public.channels;
alter publication supabase_realtime add table public.messages;
alter publication supabase_realtime add table public.users;
-- DUMMY DATA
insert into public.users (id, username)
values
('8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e', 'supabot');
insert into public.channels (slug, created_by)
values
('public', '8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e'),
('random', '8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e');
insert into public.messages (message, channel_id, user_id)
values
('Hello World 👋', 1, '8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e'),
('Perfection is attained, not when there is nothing more to add, but when there is nothing left to take away.', 2, '8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e');
insert into public.role_permissions (role, permission)
values
('admin', 'channels.delete'),
('admin', 'messages.delete'),
('moderator', 'messages.delete');
Keep your local Supabase instance running.
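The `authorize` function in the SQL above implements a simple role-based access control (RBAC) check: a user is authorized when any of their roles carries the requested permission. The same logic can be sketched in plain Python for clarity — the `ROLE_PERMISSIONS` data mirrors the dummy data above, while the `USER_ROLES` assignment and the `authorize` helper are purely illustrative, not part of any package:

```python
# Role -> permission grants, mirroring the role_permissions dummy data above.
ROLE_PERMISSIONS = {
    ("admin", "channels.delete"),
    ("admin", "messages.delete"),
    ("moderator", "messages.delete"),
}

# user_id -> set of roles, as in the user_roles table (hypothetical assignment).
USER_ROLES = {
    "8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e": {"moderator"},
}


def authorize(requested_permission: str, user_id: str) -> bool:
    """Return True if any of the user's roles grants the requested permission."""
    roles = USER_ROLES.get(user_id, set())
    return any((role, requested_permission) in ROLE_PERMISSIONS for role in roles)


print(authorize("messages.delete", "8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e"))  # True
print(authorize("channels.delete", "8d0fd2b3-9ca7-4d9e-a95f-9e13dded323e"))  # False
```

The SQL version does the same join of `user_roles` against `role_permissions` and returns whether the match count is greater than zero.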
Generating Schemas¶
What makes the `supabase-pydantic` package so powerful is its ability to quickly translate schema changes into Pydantic schemas while you develop your data layer. To do this, install the package in any project, then run the `gen` command:
$ cd /path/to/your/project # ... then activate your environment
$ pip install supabase-pydantic
$ sb-pydantic gen --type pydantic --framework fastapi --local
PostGres connection is open.
PostGres connection is closed.
Generating FastAPI Pydantic models...
FastAPI Pydantic models generated successfully: /path/to/your/project/entities/fastapi/schemas_latest.py
File formatted successfully: /path/to/your/project/entities/fastapi/schemas_latest.py
... and that's it. You now have a `schemas_latest.py` file in your project that contains the Pydantic models for the Slack clone database:
from __future__ import annotations

from datetime import datetime

from pydantic import UUID4, BaseModel, Field


############################## Custom Classes

# Note: This is a custom model class for defining common features among
# Pydantic Base Schema.
class CustomModel(BaseModel):
    pass


############################## Base Classes

class ChannelsBaseSchema(CustomModel):
    """Channels Base Schema."""

    # Primary Keys
    id: int

    # Columns
    created_by: UUID4
    inserted_at: datetime
    slug: str


class MessagesBaseSchema(CustomModel):
    """Messages Base Schema."""

    # Primary Keys
    id: int

    # Columns
    channel_id: int
    inserted_at: datetime
    message: str | None = Field(default=None)
    user_id: UUID4


class RolePermissionsBaseSchema(CustomModel):
    """RolePermissions Base Schema."""

    # Primary Keys
    id: int

    # Columns
    permission: str
    role: str


class UserRolesBaseSchema(CustomModel):
    """UserRoles Base Schema."""

    # Primary Keys
    id: int

    # Columns
    role: str
    user_id: UUID4


class UsersBaseSchema(CustomModel):
    """Users Base Schema."""

    # Primary Keys
    id: UUID4

    # Columns
    status: str | None = Field(default=None)
    username: str | None = Field(default=None)


############################## Operational Classes

class Channels(ChannelsBaseSchema):
    """Channels Schema for Pydantic.

    Inherits from ChannelsBaseSchema. Add any customization here.
    """

    # Foreign Keys
    users: list[Users] | None = Field(default=None)
    messages: list[Messages] | None = Field(default=None)


class Messages(MessagesBaseSchema):
    """Messages Schema for Pydantic.

    Inherits from MessagesBaseSchema. Add any customization here.
    """

    # Foreign Keys
    channels: list[Channels] | None = Field(default=None)
    users: list[Users] | None = Field(default=None)


class RolePermissions(RolePermissionsBaseSchema):
    """RolePermissions Schema for Pydantic.

    Inherits from RolePermissionsBaseSchema. Add any customization here.
    """

    pass


class UserRoles(UserRolesBaseSchema):
    """UserRoles Schema for Pydantic.

    Inherits from UserRolesBaseSchema. Add any customization here.
    """

    # Foreign Keys
    users: list[Users] | None = Field(default=None)


class Users(UsersBaseSchema):
    """Users Schema for Pydantic.

    Inherits from UsersBaseSchema. Add any customization here.
    """

    # Foreign Keys
    channels: list[Channels] | None = Field(default=None)
    messages: list[Messages] | None = Field(default=None)
    user_roles: list[UserRoles] | None = Field(default=None)
Exploring the Schemas¶
The generated schemas are based on the tables and relationships defined in the Slack clone database. The `BaseSchema` classes represent the base structure of each table, while the operational classes represent the operational schemas that can be used in your FastAPI application. Use these schemas to interact with the database and build your FastAPI application.
Please note the following features, which are planned for future releases:
- Generate all-null parent or base schemas for data manipulation in progeny BaseModels
- Create models for different ORM libraries (e.g. SQLAlchemy, Tortoise-ORM, etc.)
- Automatically generate CRUD operations for FastAPI applications and other frameworks
- Configuration options to prune and choose which tables to generate schemas for
- Support for more complex database structures and relationships
- Additional API related security measures and standard ORM methods
Conclusion¶
This example demonstrates how to generate FastAPI schemas from the Slack clone database using the `supabase-pydantic` package. By following the steps outlined here, you can quickly generate Pydantic schemas for your FastAPI application and start building your data layer in an automated and efficient manner.