MySQL to Avro Converter

Transform MySQL database dumps into Apache Avro schemas with automatic type detection

MySQL Input

Avro Schema Output

About MySQL to Avro Converter

Convert MySQL database dumps (CREATE TABLE and INSERT statements) to Apache Avro schema format with automatic type detection and data serialization. Perfect for big data applications using Hadoop, Kafka, and other Apache ecosystem tools.

Key Features

  • Automatic Type Detection: Intelligently detects int, long, double, boolean, and string types from MySQL data
  • Nullable Fields: Every field is emitted as a union of null and its detected type, so missing values serialize cleanly
  • Custom Schema Names: Configure schema name and namespace
  • Sample Data Generation: Optionally include JSON data for testing
  • Field Name Sanitization: Converts MySQL column names to valid Avro field names
  • Documentation: Preserves original column names in field docs
  • File Download: Save as .avsc schema file
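The field-name sanitization listed above could work roughly as follows. This is a minimal sketch, not the tool's actual code; the `sanitize_field_name` helper and its exact rules are assumptions based on the Avro naming grammar:

```python
import re

def sanitize_field_name(column: str) -> str:
    """Map a MySQL column name onto a valid Avro field name.

    Avro names must match [A-Za-z_][A-Za-z0-9_]*, so invalid
    characters are replaced with underscores and a leading digit
    gets an underscore prefix.
    """
    name = re.sub(r"[^A-Za-z0-9_]", "_", column)
    if not name or name[0].isdigit():
        name = "_" + name
    return name

print(sanitize_field_name("order-date"))   # order_date
print(sanitize_field_name("2fa_enabled"))  # _2fa_enabled
```

Since the original column name is preserved in the field's doc attribute, no information is lost even when sanitization changes the name.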

How to Use

  1. Input MySQL Data: Paste your MySQL CREATE TABLE and INSERT statements or upload a .sql file
  2. Configure Schema: Set schema name and namespace
  3. Choose Options: Toggle sample data inclusion
  4. Review Output: The Avro schema is generated automatically
  5. Copy or Download: Use the Copy or Download button to save your schema

Type Detection

  • int: Integers within the 32-bit signed range (±2,147,483,647)
  • long: Integers outside the int range
  • double: Decimal numbers (DECIMAL, FLOAT, DOUBLE)
  • boolean: true/false values
  • string: VARCHAR, TEXT, CHAR, and other text data
  • null: Empty or NULL values are treated as null
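A detection heuristic along the lines described above can be sketched in a few lines of Python. This is an illustrative simplification (the `detect_avro_type` function is hypothetical, not the tool's implementation):

```python
def detect_avro_type(value: str) -> str:
    """Guess the Avro type of a single MySQL literal (simplified heuristic)."""
    v = value.strip()
    if v == "" or v.upper() == "NULL":
        return "null"
    if v.lower() in ("true", "false"):
        return "boolean"
    try:
        n = int(v)
        # Avro int is 32-bit signed; anything larger needs long.
        return "int" if -2**31 <= n < 2**31 else "long"
    except ValueError:
        pass
    try:
        float(v)
        return "double"
    except ValueError:
        return "string"

print(detect_avro_type("42"))          # int
print(detect_avro_type("3000000000"))  # long
print(detect_avro_type("75000.00"))    # double
```

Note that a real converter would also consult the declared column types from the CREATE TABLE statement, not just the INSERT values.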

Example Conversion

MySQL Input:

CREATE TABLE employees (
  id INT PRIMARY KEY,
  name VARCHAR(100),
  age INT,
  salary DECIMAL(10,2)
);

INSERT INTO employees VALUES (1, 'John Doe', 28, 75000.00);
INSERT INTO employees VALUES (2, 'Jane Smith', 34, 82000.50);

Avro Schema Output:

{
  "type": "record",
  "name": "TableData",
  "namespace": "com.example",
  "doc": "Generated from MySQL database dump",
  "fields": [
    {
      "name": "id",
      "type": ["null", "int"],
      "doc": "id"
    },
    {
      "name": "name",
      "type": ["null", "string"],
      "doc": "name"
    },
    {
      "name": "age",
      "type": ["null", "int"],
      "doc": "age"
    },
    {
      "name": "salary",
      "type": ["null", "double"],
      "doc": "salary"
    }
  ]
}
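The schema above can also be assembled programmatically. The stdlib-only sketch below builds the same record from the example's columns and writes it out as a .avsc file; the `columns` list stands in for the converter's parsing step, which is not shown:

```python
import json

# Columns parsed from the CREATE TABLE example, with their detected Avro types.
columns = [("id", "int"), ("name", "string"), ("age", "int"), ("salary", "double")]

schema = {
    "type": "record",
    "name": "TableData",
    "namespace": "com.example",
    "doc": "Generated from MySQL database dump",
    "fields": [
        # Each field is a nullable union; the doc attribute keeps the column name.
        {"name": col, "type": ["null", avro_type], "doc": col}
        for col, avro_type in columns
    ],
}

# Save as a .avsc schema file, mirroring the tool's download option.
with open("TableData.avsc", "w") as f:
    json.dump(schema, f, indent=2)
```

The resulting file is plain JSON and can be loaded by any Avro library (for example, fastavro or the official Java bindings) to serialize the INSERT rows.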

Common Use Cases

  • Hadoop Integration: Define schemas for Hadoop data processing
  • Kafka Streaming: Create schemas for Kafka message serialization
  • Data Lakes: Structure MySQL data for Apache Parquet and ORC formats
  • ETL Pipelines: Define data contracts for MySQL to big data ETL processes
  • Database Migration: Convert MySQL schemas to Avro for cloud data warehouses
  • Data Archival: Archive MySQL data in efficient Avro binary format

About Apache Avro

Apache Avro is a data serialization system that provides rich data structures, a compact binary format, and schema evolution capabilities. It's widely used in big data ecosystems for efficient data storage and transmission, particularly with Hadoop, Kafka, and Spark.

Supported MySQL Data Types

  • Numeric: INT, BIGINT, DECIMAL, FLOAT, DOUBLE → int/long/double
  • String: VARCHAR, CHAR, TEXT, MEDIUMTEXT, LONGTEXT → string
  • Boolean: BOOLEAN, TINYINT(1) → boolean
  • NULL: NULL values → null in Avro union types
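The type mapping above amounts to a small lookup table. A sketch of one possible implementation (the `MYSQL_TO_AVRO` table and `map_type` helper are illustrative assumptions; note that DECIMAL is widened to double here, whereas Avro also offers a decimal logical type for exact values):

```python
# Simplified mapping from MySQL base types to Avro primitive types.
MYSQL_TO_AVRO = {
    "INT": "int",
    "BIGINT": "long",
    "DECIMAL": "double",
    "FLOAT": "double",
    "DOUBLE": "double",
    "VARCHAR": "string",
    "CHAR": "string",
    "TEXT": "string",
    "MEDIUMTEXT": "string",
    "LONGTEXT": "string",
    "BOOLEAN": "boolean",
}

def map_type(mysql_type: str) -> str:
    """Look up the Avro type for a MySQL type like 'VARCHAR(100)' or 'TINYINT(1)'."""
    base = mysql_type.upper().split("(")[0].strip()
    if base == "TINYINT" and "(1)" in mysql_type:
        return "boolean"  # MySQL convention: TINYINT(1) is used for booleans
    return MYSQL_TO_AVRO.get(base, "string")

print(map_type("DECIMAL(10,2)"))  # double
print(map_type("TINYINT(1)"))     # boolean
```

Falling back to string for unrecognized types is a common safe default, since any MySQL literal can be represented as text.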

Privacy & Security

All conversions happen locally in your browser. Your MySQL data is never uploaded to any server, ensuring complete privacy and security.