zllm2 / config.json
{
  "model_type": "zbrain",
  "hidden_size": 128,
  "num_hidden_layers": 2,
  "vocab_size": 10000,
  "max_position_embeddings": 512,
  "initializer_range": 0.02,
  "hidden_dropout_prob": 0.2,
  "attention_probs_dropout_prob": 0.1
}
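Since `"zbrain"` is not a standard `transformers` architecture, `AutoConfig` will likely not recognize it; a minimal sketch of reading this config with the standard-library `json` module instead, plus a rough lower-bound parameter estimate from the table sizes (the per-layer weight shapes depend on the unknown zbrain architecture, so only the embedding and position tables are counted here):

```python
import json

# The config.json contents shown above, embedded verbatim.
config_text = (
    '{"model_type": "zbrain", "hidden_size": 128, "num_hidden_layers": 2, '
    '"vocab_size": 10000, "max_position_embeddings": 512, '
    '"initializer_range": 0.02, "hidden_dropout_prob": 0.2, '
    '"attention_probs_dropout_prob": 0.1}'
)
config = json.loads(config_text)

# Lower-bound parameter count: token-embedding table plus learned
# position-embedding table (layer weights omitted, architecture unknown).
embedding_params = config["vocab_size"] * config["hidden_size"]
position_params = config["max_position_embeddings"] * config["hidden_size"]

print(embedding_params)  # 10000 * 128 = 1280000
print(position_params)   # 512 * 128 = 65536
```

With `hidden_size` of 128 and only 2 hidden layers, this describes a very small model; the embedding tables alone dominate the parameter count at these dimensions.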