---
tags:
  - synthetic-data
  - physics
  - symbolic-regression
  - scientific-discovery
  - physical-theories
license: mit
---

# SynPAT: Generating Synthetic Physical Theories with Data

This is the Hugging Face dataset entry for SynPAT, a synthetic theory and data generation system developed for the paper:
SynPAT: Generating Synthetic Physical Theories with Data
GitHub: https://github.com/jlenchner/theorizer

SynPAT generates symbolic physical systems and corresponding synthetic data to benchmark symbolic regression and scientific discovery algorithms. Each synthetic system includes symbolic equations, dimensionally consistent variables/constants, algebraic consequences, and both clean and noisy datasets.


## Dataset Structure

The dataset is organized hierarchically by configuration and system index:

`dataset/vars_{v}_derivs_{d}_eqns_{e}/System_{n}/`

Where:

  • {v} ∈ {6, 7, 8, 9} is the number of symbolic variables
  • {d} ∈ {2, 3, 4} is the number of derivatives (among the symbolic variables)
  • {e} ∈ {4, 5, 6} is the number of equations in the system
  • {n} ∈ {1, 2, 3} indexes the individual system instances per configuration
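
For illustration, the full directory layout can be enumerated from these ranges. This is a minimal sketch (not part of the dataset's tooling) that assumes the `vars_{v}_derivs_{d}_eqns_{e}` folder naming described below:

```python
from itertools import product

def system_dirs(root="dataset"):
    """Enumerate every System_{n} directory implied by the layout above."""
    return [
        f"{root}/vars_{v}_derivs_{d}_eqns_{e}/System_{n}"
        for v, d, e, n in product((6, 7, 8, 9), (2, 3, 4), (4, 5, 6), (1, 2, 3))
    ]

paths = system_dirs()  # 4 × 3 × 3 configurations × 3 systems = 108 paths
```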

Each System_{n} directory contains exactly 17 files:

Symbolic System and Consequence:

  • system.txt — base symbolic system with metadata including variable types, units, constants, and equations
  • consequence.txt — derived consequence equation (single polynomial) with associated metadata

Noiseless Data:

  • system.dat — clean numerical dataset corresponding to system.txt
  • consequence.dat — clean data for the consequence polynomial

Noisy Data:

For each noise level ε ∈ {0.001, 0.01, 0.05, 0.1}, the following files are included:

  • system_ε.dat — noisy system data with Gaussian noise of standard deviation ε
  • consequence_ε.dat — noisy consequence data with Gaussian noise of standard deviation ε

This results in:

  • 4 system_ε.dat files
  • 4 consequence_ε.dat files
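
For illustration, noisy files could be reproduced from the clean data along these lines. This is a sketch, not the actual SynPAT generator; it assumes additive zero-mean Gaussian noise with standard deviation ε, as the wording above suggests:

```python
import random

def add_noise(rows, eps, seed=0):
    """Return a copy of rows with zero-mean Gaussian noise (std = eps) added to each value.
    Assumes additive noise; the real generator may differ in detail."""
    rng = random.Random(seed)  # fixed seed for reproducibility of this sketch
    return [[x + rng.gauss(0.0, eps) for x in row] for row in rows]

clean = [[1.0, 2.0], [3.0, 4.0]]
noisy = add_noise(clean, 0.001)
```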

Replacement Systems:

  • replacement_1.txt through replacement_5.txt — versions of system.txt where one axiom has been replaced, used for consequence prediction benchmarking under ablation.

Summary of File Count per System:

  • 2 base files: system.txt, consequence.txt
  • 2 clean data files: system.dat, consequence.dat
  • 8 noisy data files: system_ε.dat, consequence_ε.dat for 4 ε values
  • 5 replacement files: replacement_1.txt through replacement_5.txt

Total per system: 2 + 2 + 8 + 5 = 17 files

Each configuration folder (vars_{v}_derivs_{d}_eqns_{e}) contains exactly three systems: System_1, System_2, System_3.

The full dataset spans all 4 × 3 × 3 = 36 configurations, totaling 108 systems and 1836 files.
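
The counts above can be verified in a few lines:

```python
configs = 4 * 3 * 3            # choices for {v} × {d} × {e}
systems = configs * 3          # System_1..System_3 per configuration
files_per_system = 2 + 2 + 8 + 5  # base + clean data + noisy data + replacements
total_files = systems * files_per_system
```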

## File Format Examples

### system.txt

system.txt contains the symbolic system, including variables, constants, derivatives, equations, and units:

```text
Variables: ['Fc', 'Fg', 'W', 'd1', 'd2', 'm1']
Constants: ['G', 'c']
Derivatives: ['dx1dt', 'd2x1dt2', 'dx2dt']
Equations:
dx1dt*dx2dt*d2*m1 - W*d1
d2x1dt2*W*m1 - Fc*Fg*d1
2*c*Fg + dx2dt*Fc - dx2dt*Fg
G*d2x1dt2*d2*m1 + G*Fc*d2 - G*W + 4*d2x1dt2*dx1dt^2*d1^2
-d2x1dt2*Fc*m1 + d2x1dt2*Fg*m1 + Fc*Fg
Units of Measure of Variables: ['s^(-2)*kg*m', 's^(-2)*kg*m', 'm^2*s^(-2)*kg', 'm', 'm', 'kg']
Units of Measure of Constants: ['1/kg*m^3*s^(-2)', '1/s*m']
Units of Measure of Derivatives: ['1/s*m', 's^(-2)*m', '1/s*m']
Units of Measure of Equations:
m^3*s^(-2)*kg
kg^2*m^3*s^(-4)
m^2*s^(-3)*kg
m^5*s^(-4)
kg^2*m^2*s^(-4)
```
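
A minimal parser for the list-valued header lines and the equation block might look like the sketch below. `parse_system` is a hypothetical helper, not part of the SynPAT codebase, and it deliberately ignores the bare unit lines under "Units of Measure of Equations:":

```python
import ast

def parse_system(text):
    """Parse list-valued header lines (e.g. "Variables: [...]") and the
    equation block of a system.txt-style file."""
    meta, equations = {}, []
    in_eqns = False
    for ln in (l.strip() for l in text.splitlines() if l.strip()):
        if ln == "Equations:":
            in_eqns = True
        elif ":" in ln and ln.split(":", 1)[1].strip().startswith("["):
            key, val = ln.split(":", 1)
            meta[key.strip()] = ast.literal_eval(val.strip())
            in_eqns = False  # a new list-valued field ends the equation block
        elif in_eqns:
            equations.append(ln)
    return meta, equations

sample = """Variables: ['Fc', 'Fg']
Constants: ['G', 'c']
Equations:
2*c*Fg + Fc - Fg
Fc - Fg
Units of Measure of Variables: ['s^(-2)*kg*m', 's^(-2)*kg*m']"""
meta, eqns = parse_system(sample)
```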

### consequence.txt

consequence.txt repeats the system metadata and adds fields for data generation and symbolic regression tasks (the measured quantities and the target polynomial):

```text
Variables: ['Fc', 'Fg', 'W', 'd1', 'd2', 'm1']
Constants: ['G', 'c']
Derivatives: ['dx1dt', 'd2x1dt2', 'dx2dt']
Equations:
dx1dt*dx2dt*d2*m1 - W*d1
d2x1dt2*W*m1 - Fc*Fg*d1
2*c*Fg + dx2dt*Fc - dx2dt*Fg
G*d2x1dt2*d2*m1 + G*Fc*d2 - G*W + 4*d2x1dt2*dx1dt^2*d1^2
-d2x1dt2*Fc*m1 + d2x1dt2*Fg*m1 + Fc*Fg
Units of Measure of Variables: ['s^(-2)*kg*m', 's^(-2)*kg*m', 'm^2*s^(-2)*kg', 'm', 'm', 'kg']
Units of Measure of Constants: ['1/kg*m^3*s^(-2)', '1/s*m']
Units of Measure of Derivatives: ['1/s*m', 's^(-2)*m', '1/s*m']
Units of Measure of Equations:
m^3*s^(-2)*kg
kg^2*m^3*s^(-4)
m^2*s^(-3)*kg
m^5*s^(-4)
kg^2*m^2*s^(-4)

Measured Variables: ['d1', 'Fg', 'm1', 'W']
Observed Constants: []
Measured Derivatives: ['d2x1dt2']

Target Polynomial:
d1*m1*d2x1dt2*Fg^2 - m1^2*d2x1dt2^2*W + m1*d2x1dt2*W*Fg
```
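
The caret exponent notation maps directly onto Python's `**` operator, so a target polynomial can be evaluated at a candidate point as in this sketch (`eval_poly` is an illustrative helper; `eval` is only safe here because the expressions come from the dataset itself):

```python
target = "d1*m1*d2x1dt2*Fg^2 - m1^2*d2x1dt2^2*W + m1*d2x1dt2*W*Fg"

def eval_poly(expr, values):
    """Evaluate a caret-exponent polynomial string at a variable assignment."""
    # translate ^ to Python's ** and evaluate with builtins disabled
    return eval(expr.replace("^", "**"), {"__builtins__": {}}, dict(values))

point = dict(d1=1.0, m1=1.0, d2x1dt2=1.0, Fg=1.0, W=1.0)
residual = eval_poly(target, point)  # 1 - 1 + 1 = 1.0 at the all-ones point
```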

### replacement_i.txt

replacement_i.txt has the same format as system.txt, but with one axiom replaced by a different, dimensionally consistent axiom (the replaced axiom is highlighted with >>> <<< below):

```text
Variables: ['Fc', 'Fg', 'W', 'd1', 'd2', 'm1']
Constants: ['G', 'c']
Derivatives: ['dx1dt', 'd2x1dt2', 'dx2dt']
Equations:
dx1dt*dx2dt*d2*m1 - W*d1
>>> c*Fc*d2 + c*Fg*d1 - dx1dt*Fg*d1 + dx2dt*Fg*d2 + dx2dt*W <<<
2*c*Fg + dx2dt*Fc - dx2dt*Fg
G*d2x1dt2*d2*m1 + G*Fc*d2 - G*W + 4*d2x1dt2*dx1dt^2*d1^2
-d2x1dt2*Fc*m1 + d2x1dt2*Fg*m1 + Fc*Fg
Units of Measure of Variables: ['s^(-2)*kg*m', 's^(-2)*kg*m', 'm^2*s^(-2)*kg', 'm', 'm', 'kg']
Units of Measure of Constants: ['1/kg*m^3*s^(-2)', '1/s*m']
Units of Measure of Derivatives: ['1/s*m', 's^(-2)*m', '1/s*m']
Units of Measure of Equations:
m^3*s^(-2)*kg
m^3*s^(-3)*kg
m^2*s^(-3)*kg
m^5*s^(-4)
kg^2*m^2*s^(-4)
```
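
Since exactly one axiom differs between system.txt and each replacement file, the swapped pair can be located by a set difference over the equation strings. A minimal sketch (`replaced_axiom` is a hypothetical helper, and it assumes the equation strings are reproduced verbatim in both files):

```python
def replaced_axiom(system_eqns, replacement_eqns):
    """Return (removed, inserted): the axiom dropped from the base system
    and the dimensionally consistent axiom substituted for it."""
    base, repl = set(system_eqns), set(replacement_eqns)
    (removed,), (inserted,) = base - repl, repl - base  # exactly one of each
    return removed, inserted

system = ["dx1dt*dx2dt*d2*m1 - W*d1",
          "d2x1dt2*W*m1 - Fc*Fg*d1",
          "2*c*Fg + dx2dt*Fc - dx2dt*Fg"]
replacement = ["dx1dt*dx2dt*d2*m1 - W*d1",
               "c*Fc*d2 + c*Fg*d1 - dx1dt*Fg*d1 + dx2dt*Fg*d2 + dx2dt*W",
               "2*c*Fg + dx2dt*Fc - dx2dt*Fg"]
removed, inserted = replaced_axiom(system, replacement)
```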

## Data Files

Each .dat file contains rows of numerical data corresponding to samples used for learning or evaluation. Each row represents one data point.

  • For consequence.dat and its noisy versions (e.g., consequence_0.1.dat), the columns correspond to the observed constants, followed by measured derivatives, and then measured variables, in the exact order as specified in consequence.txt.

Example row from consequence_0.1.dat:

```text
1.925664644193098241e+00 2.872700594236812677e+00 ...
```

  • For system.dat and its noisy versions (e.g., system_0.1.dat), the first line is a header specifying the variable ordering, followed by rows of numerical data where each column matches the respective variable in the header.

Example header and data snippet from system_0.1.dat:

```text
G c dx1dt d2x1dt2 dx2dt Fc W d1 Fg m1 d2
1.000000 1.0000000 ......
```
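
A system .dat file in this header-plus-rows layout can be read with the standard library alone. This sketch (`read_system_dat` is an illustrative helper, shown here on inline sample text rather than a real file) assumes whitespace-separated columns as in the snippet above:

```python
def read_system_dat(text):
    """Parse a system .dat file: the first line is the column header,
    the remaining lines are whitespace-separated floats."""
    lines = [ln for ln in text.splitlines() if ln.strip()]
    header = lines[0].split()
    rows = [[float(x) for x in ln.split()] for ln in lines[1:]]
    return header, rows

sample = "G c Fc\n1.000000 1.000000 2.500000\n1.000000 1.000000 3.000000\n"
header, rows = read_system_dat(sample)
```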

## Authors


## License

MIT License


For issues, questions, or contributions, please visit:
https://github.com/jlenchner/theorizer