uncased_L-12_H-128_A-2 / config.json
{"hidden_size": 128, "hidden_act": "gelu", "initializer_range": 0.02, "vocab_size": 30522, "hidden_dropout_prob": 0.1, "num_attention_heads": 2, "type_vocab_size": 2, "max_position_embeddings": 512, "num_hidden_layers": 12, "intermediate_size": 512, "attention_probs_dropout_prob": 0.1}