Hi there 👋

I'm fritz, a data engineer and passionate technologist who loves tackling interesting problems with code 😀. Some of the projects I am most proud of are:

Sling (https://github.com/slingdata-io) -- Extract and load data into your database with ease, straight from your shell terminal.

Sling is a passion project turned free CLI & SaaS product that offers an easy way to create and maintain high-volume data pipelines using the Extract & Load (EL) approach. It focuses on data movement between:
  • Database to Database
  • File System to Database
  • Database to File System

Ever wanted to quickly pipe a CSV or JSON file into your database? Use sling to do so:

```shell
cat my_file.csv | sling run --tgt-conn MYDB --tgt-object my_schema.my_table
```

Or want to copy data between two databases? Do it with sling:

```shell
sling run --src-conn PG_DB --src-stream public.transactions --tgt-conn MYSQL_DB --tgt-object mysql.bank_transactions --mode full-refresh
```
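A full refresh rebuilds the target table on every run; for ongoing syncs, sling also supports an incremental mode. A hedged sketch (the key column names below are illustrative, and the flags should be verified against `sling run --help`):

```shell
# Incremental load: only move rows whose update key has advanced since
# the last run. Connection names reuse the example above; the columns
# id / last_updated_at are made-up placeholders.
sling run \
  --src-conn PG_DB \
  --src-stream public.transactions \
  --tgt-conn MYSQL_DB \
  --tgt-object mysql.bank_transactions \
  --mode incremental \
  --primary-key id \
  --update-key last_updated_at
```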

Sling can also easily manage your local connections with the `sling conns` command:

```shell
$ sling conns set MY_PG url='postgresql://postgres:myPassword@pghost:5432/postgres'

$ sling conns list
 -------------------------- ----------------- ------------------- 
| CONN NAME                | CONN TYPE       | SOURCE            |
 -------------------------- ----------------- ------------------- 
| AWS_S3                   | FileSys - S3    | sling env yaml    |
| FINANCE_BQ               | DB - BigQuery   | sling env yaml    |
| DO_SPACES                | FileSys - S3    | sling env yaml    |
| LOCALHOST_DEV            | DB - PostgreSQL | dbt profiles yaml |
| MSSQL                    | DB - SQLServer  | sling env yaml    |
| MYSQL                    | DB - MySQL      | sling env yaml    |
| ORACLE_DB                | DB - Oracle     | env variable      |
| MY_PG                    | DB - PostgreSQL | sling env yaml    |
 -------------------------- ----------------- ------------------- 

$ sling conns discover LOCALHOST_DEV
9:05AM INF Found 344 streams:
 - "public"."accounts"
 - "public"."bills"
 - "public"."connections"
 ...
```
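Before wiring a connection into a pipeline, it can also be sanity-checked with the `conns test` sub-command (output omitted; it simply reports whether the connection succeeds):

```shell
# Check that the stored credentials work end-to-end against the database
sling conns test MY_PG
```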

dbREST (https://github.com/flarco/dbREST) -- Instantly spin up a REST API for any major database.

dbREST is essentially an API backend that you can put in front of your database. Ever wanted to spin up an API service in front of your Snowflake, MySQL or even SQLite database? Well, dbREST allows just that! Running `dbrest serve` launches an API process which allows you to:

Select a table's data:

```
GET /snowflake_db/my_schema/docker_logs?.columns=container_name,timestamp&.limit=100

[
  { "container_name": "vector", "timestamp": "2022-04-22T23:54:06.644268688Z" },
  { "container_name": "postgres", "timestamp": "2022-04-22T23:54:06.644315426Z" },
  { "container_name": "api", "timestamp": "2022-04-22T23:54:06.654821046Z" }
]
```
Insert into a table:

```
POST /snowflake_db/my_schema/docker_logs

[
  {"container_name":"vector","host":"vector","image":"timberio/vector:0.21.1-debian","message":"2022-04-22T23:54:06.644214Z  INFO vector::sources::docker_logs: Capturing logs from now on. now=2022-04-22T23:54:06.644150817+00:00","stream":"stderr","timestamp":"2022-04-22T23:54:06.644268688Z"}
]
```
Update a table:

```
PATCH /snowflake_db/my_schema/my_table?.key=col1

[
  { "col1": "123", "timestamp": "2022-04-22T23:54:06.644268688Z" },
  { "col1": "124", "timestamp": "2022-04-22T23:54:06.644315426Z" },
  { "col1": "125", "timestamp": "2022-04-22T23:54:06.654821046Z" }
]
```
Upsert into a table:

```
PUT /snowflake_db/my_schema/my_table?.key=col1

[
  { "col1": "123", "timestamp": "2022-04-22T23:54:06.644268688Z" },
  { "col1": "124", "timestamp": "2022-04-22T23:54:06.644315426Z" },
  { "col1": "125", "timestamp": "2022-04-22T23:54:06.654821046Z" }
]
```
Submit a custom SQL query:

```
POST /snowflake_db/.sql

select * from my_schema.docker_logs where timestamp is not null

[
  { "container_name": "vector", "timestamp": "2022-04-22T23:54:06.644268688Z" },
  { "container_name": "postgres", "timestamp": "2022-04-22T23:54:06.644315426Z" },
  { "container_name": "api", "timestamp": "2022-04-22T23:54:06.654821046Z" }
]
```
List all columns in a table:

```
GET /snowflake_db/my_schema/docker_logs/.columns

[
  {"column_id":1, "column_name":"timestamp", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"},
  {"column_id":2, "column_name":"container_name", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"},
  {"column_id":3, "column_name":"host", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"},
  {"column_id":4, "column_name":"image", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"}
]
```
List all tables in a schema:

```
GET /snowflake_db/my_schema/.tables

[
  {"database_name":"default", "is_view":"table", "schema_name":"my_schema", "table_name":"docker_logs"},
  {"database_name":"default", "is_view":"table", "schema_name":"my_schema", "table_name":"example"},
  {"database_name":"default", "is_view":"view", "schema_name":"my_schema", "table_name":"place_vw"}
]
```
List all columns, in all tables in a schema:

```
GET /snowflake_db/my_schema/.columns

[
  {"column_id":1, "column_name":"timestamp", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"},
  {"column_id":2, "column_name":"container_name", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"},
  {"column_id":3, "column_name":"host", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"},
  {"column_id":4, "column_name":"image", "column_type":"String", "database_name":"default", "schema_name":"my_schema", "table_name":"docker_logs", "table_type":"table"}
]
```
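Any HTTP client can drive the endpoints above. A minimal curl sketch, assuming the server started with `dbrest serve` is reachable at localhost:7987 (the host and port here are assumptions, not confirmed defaults; use whatever address your server actually binds):

```shell
# Select two columns, capped at 100 rows (mirrors the GET example above)
curl "http://localhost:7987/snowflake_db/my_schema/docker_logs?.columns=container_name,timestamp&.limit=100"

# Submit a custom SQL query as the request body
curl -X POST "http://localhost:7987/snowflake_db/.sql" \
  --data 'select * from my_schema.docker_logs where timestamp is not null'
```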

Of course, there is authentication / authorization logic. It is token-based: tokens are issued with the `dbrest token` sub-command and tied to roles defined in a YAML config file:

```yaml
reader:
  snowflake_db:
    allow_read:
      - schema1.*
      - schema2.table1
    allow_sql: 'disable'

  my_pg:
    allow_read:
      - '*'
    allow_sql: 'disable'

writer:
  snowflake_db:
    allow_read:
      - schema1.*
      - schema2.table1
    allow_write:
      - schema2.table3
    allow_sql: 'disable'

  my_pg:
    allow_read:
      - '*'
    allow_write:
      - '*'
    allow_sql: 'any'
```

We can now issue tokens with `dbrest token issue <token_name> --roles reader,writer`.
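Issued tokens then travel with each request. A hedged sketch, assuming a bearer-style Authorization header (an assumption on my part; check the dbREST docs for the exact header convention) and a made-up token name:

```shell
# Issue a token bound to both roles defined above (token name is illustrative)
dbrest token issue analyst_token --roles reader,writer

# Use it from a client; the Bearer header convention is assumed, not confirmed
curl "http://localhost:7987/my_pg/public/accounts" \
  -H "Authorization: Bearer $DBREST_TOKEN"
```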

It is built in Go. And as you might have guessed, it also powers a lot of dbNet :).

dbREST is in active development. Here are some of the databases it connects to:

  • Clickhouse
  • Google BigQuery
  • Google BigTable
  • MySQL
  • Oracle
  • Redshift
  • PostgreSQL
  • SQLite
  • SQL Server
  • Snowflake
  • DuckDB (coming soon)
  • ScyllaDB (coming soon)
  • Firebolt (coming soon)
  • Databricks (coming soon)

dbNet (https://github.com/dbnet-io/dbnet) -- A database web client that lets your SQL superpowers shine.

dbNet is a web-based SQL IDE using Go as the backend and your browser (or Electron) as the front-end. I built it because I was unsatisfied with the database clients out there: a lot of them are top-heavy, unintuitive, slow or expensive. dbNet aims to be smart and useful, especially for analysis and simply querying any SQL database.

The goal is to make it a great SQL IDE that gives useful context as you hover over table and column names, for example. It should also let you ingest files with ease: imagine drag-and-dropping a CSV file into a schema, with dbNet auto-creating the table with proper column types. The other nifty part is that it can run from a shell/terminal on any machine and lets users access the UI from the browser (via `dbnet serve`).

*(screenshot of the dbNet interface)*

dbNet is in active development and will be open-sourced soon. Here are some of the databases it connects to:

  • Clickhouse
  • Google BigQuery
  • Google BigTable
  • MySQL
  • Oracle
  • Redshift
  • PostgreSQL
  • SQLite
  • SQL Server
  • Snowflake
  • DuckDB (coming soon)
  • ScyllaDB (coming soon)
  • Firebolt (coming soon)
  • Databricks (coming soon)
