MySQL to Avro Converter
Transform MySQL database dumps into Apache Avro schema with automatic type detection
Related Tools
MySQL to BBCode
Convert MySQL database dumps to BBCode table format for forums
MySQL to CSV
Convert MySQL CREATE TABLE and INSERT statements to CSV format with customizable delimiters
MySQL to DAX
Convert MySQL database dumps to DAX table expressions for Power BI and Analysis Services
MySQL to Excel
Convert MySQL CREATE TABLE and INSERT statements to Excel XLSX format with formatting
MySQL to Firebase
Convert MySQL CREATE TABLE and INSERT statements to Firebase Realtime Database JSON structure
MySQL to HTML
Convert MySQL CREATE TABLE and INSERT statements to HTML table with styling options
About MySQL to Avro Converter
Convert MySQL database dumps (CREATE TABLE and INSERT statements) to Apache Avro schema format with automatic type detection and data serialization. Perfect for big data applications using Hadoop, Kafka, and other Apache ecosystem tools.
Key Features
- Automatic Type Detection: Intelligently detects int, long, double, boolean, and string types from MySQL data
- Nullable Fields: All fields support null values for flexibility
- Custom Schema Names: Configure schema name and namespace
- Sample Data Generation: Optionally include JSON data for testing
- Field Name Sanitization: Converts MySQL column names to valid Avro field names
- Documentation: Preserves original column names in field docs
- File Download: Save as .avsc schema file
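As a rough sketch of the field-name sanitization described above (the function name and exact rules here are illustrative, not the tool's actual source):

```python
import re

def sanitize_field_name(column: str) -> str:
    """Turn a MySQL column name into a valid Avro field name (sketch)."""
    name = column.strip().lower()
    # Replace any character that is not a lowercase letter, digit, or underscore
    name = re.sub(r"[^a-z0-9_]", "_", name)
    # Avro names must not start with a digit
    if name and name[0].isdigit():
        name = "_" + name
    return name
```

The original column name would still be kept in the field's doc attribute, so no information is lost by the renaming.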
How to Use
- Input MySQL Data: Paste your MySQL CREATE TABLE and INSERT statements or upload a .sql file
- Configure Schema: Set schema name and namespace
- Choose Options: Toggle sample data inclusion
- Review Output: The Avro schema is generated automatically
- Copy or Download: Use the Copy or Download button to save your schema
Type Detection
- int: 32-bit integers (roughly −2.1 to +2.1 billion)
- long: 64-bit integers, or integer values outside the int range
- double: Decimal numbers (DECIMAL, FLOAT, DOUBLE)
- boolean: true/false values
- string: VARCHAR, TEXT, CHAR, and other text data
- null: Empty or NULL values are treated as null
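The detection rules above can be sketched in a few lines; infer_avro_type is a hypothetical helper for illustration, not the converter's actual implementation:

```python
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def infer_avro_type(value: str) -> str:
    """Map a raw MySQL literal to an Avro primitive type name (sketch)."""
    v = value.strip()
    if v == "" or v.upper() == "NULL":
        return "null"
    if v.lower() in ("true", "false"):
        return "boolean"
    try:
        n = int(v)
        # Fits in 32 bits -> int, otherwise long
        return "int" if INT32_MIN <= n <= INT32_MAX else "long"
    except ValueError:
        pass
    try:
        float(v)
        return "double"
    except ValueError:
        return "string"  # fall back to string for anything else
```

For example, "75000.00" is not a valid int but is a valid float, so it infers as double, while "3000000000" exceeds the 32-bit range and infers as long.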
Example Conversion
MySQL Input:
CREATE TABLE employees (
  id INT PRIMARY KEY,
  name VARCHAR(100),
  age INT,
  salary DECIMAL(10,2)
);

INSERT INTO employees VALUES (1, 'John Doe', 28, 75000.00);
INSERT INTO employees VALUES (2, 'Jane Smith', 34, 82000.50);
Avro Schema Output:
{
"type": "record",
"name": "TableData",
"namespace": "com.example",
"doc": "Generated from MySQL database dump",
"fields": [
{
"name": "id",
"type": ["null", "int"],
"doc": "id"
},
{
"name": "name",
"type": ["null", "string"],
"doc": "name"
},
{
"name": "age",
"type": ["null", "int"],
"doc": "age"
},
{
"name": "salary",
"type": ["null", "double"],
"doc": "salary"
}
]
}
Common Use Cases
- Hadoop Integration: Define schemas for Hadoop data processing
- Kafka Streaming: Create schemas for Kafka message serialization
- Data Lakes: Structure MySQL data for Apache Parquet and ORC formats
- ETL Pipelines: Define data contracts for MySQL to big data ETL processes
- Database Migration: Convert MySQL schemas to Avro for cloud data warehouses
- Data Archival: Archive MySQL data in efficient Avro binary format
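All of these use cases start from the same generated record schema. Assembling it programmatically could look like the sketch below; build_record_schema and the column list are illustrative assumptions, not the tool's actual code:

```python
import json

def build_record_schema(name, namespace, columns):
    """columns: list of (field_name, avro_primitive_type) pairs."""
    return {
        "type": "record",
        "name": name,
        "namespace": namespace,
        "doc": "Generated from MySQL database dump",
        "fields": [
            # Each field is a ["null", type] union so it can hold NULLs
            {"name": col, "type": ["null", typ], "doc": col}
            for col, typ in columns
        ],
    }

schema = build_record_schema(
    "TableData", "com.example",
    [("id", "int"), ("name", "string"), ("age", "int"), ("salary", "double")],
)
print(json.dumps(schema, indent=2))  # prints a schema like the example above
```

The resulting JSON can be saved as a .avsc file and fed to any Avro-compatible tool.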
About Apache Avro
Apache Avro is a data serialization system that provides rich data structures, a compact binary format, and schema evolution capabilities. It's widely used in big data ecosystems for efficient data storage and transmission, particularly with Hadoop, Kafka, and Spark.
Supported MySQL Data Types
- Numeric: INT, BIGINT, DECIMAL, FLOAT, DOUBLE → int/long/double
- String: VARCHAR, CHAR, TEXT, MEDIUMTEXT, LONGTEXT → string
- Boolean: BOOLEAN, TINYINT(1) → boolean
- NULL: NULL values → null in Avro union types
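A minimal sketch of the declared-type mapping above (mysql_type_to_avro is a hypothetical helper that follows the table, not the tool's actual source):

```python
def mysql_type_to_avro(mysql_type: str) -> str:
    """Map a declared MySQL column type to an Avro primitive (sketch)."""
    t = mysql_type.strip().upper()
    # TINYINT(1) is MySQL's conventional boolean
    if t in ("TINYINT(1)", "BOOLEAN"):
        return "boolean"
    base = t.split("(")[0]  # drop any length/precision suffix
    if base in ("DECIMAL", "FLOAT", "DOUBLE"):
        return "double"
    if base in ("INT", "INTEGER", "SMALLINT", "MEDIUMINT", "TINYINT"):
        return "int"
    if base == "BIGINT":
        return "long"
    # VARCHAR, CHAR, TEXT, MEDIUMTEXT, LONGTEXT, and anything unrecognized
    return "string"
```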
FAQ
- Why are field types wrapped in ["null", "type"]?
This pattern defines a nullable field in Avro: the field can hold either null or a value of the specified type (int, long, double, boolean, or string).
- What if my MySQL column names contain spaces or special characters?
The tool sanitizes column names into valid Avro field names by lowercasing them and replacing invalid characters with underscores, while preserving the original column name in the doc field.
- Can I use the generated schema with schema registries?
Yes. The output is standard Avro JSON schema and can be used with Kafka Schema Registry and other Avro-compatible systems after any organization-specific tweaks.
- How accurate is the automatic type detection?
The tool inspects sample values in each column to infer int, long, double, boolean, or string. For edge cases (mixed data), it falls back to string for safety.
- Is my SQL data uploaded anywhere during conversion?
No. All parsing and Avro generation is performed entirely in your browser. Your data is never sent to a server.
Privacy & Security
All conversions happen locally in your browser. Your MySQL data is never uploaded to any server, ensuring complete privacy and security.
