Problem Statement¶
LLMs play a great role in AI, but they need GPUs and heavy computation. That is not only costly in money terms; it also blocks the democratisation of AI, which is what drives the rapid growth of open-source AI. There is therefore a need for a system where one can create a one-billion, 100-billion, or even one-trillion parameter model without a GPU or massive RAM or computational resources: with 16 GB of RAM and an i5 processor, one should be able to run that model. The Kratim Budhimata model provides that key capability. You can create any number of models inside a single model, which makes managing a large number of models feasible, and you can train and run only the models that actually contribute to the prediction for a particular user query. This way there is no need to ever load all the models into RAM, even once; the full model never has to be loaded at any time, and that is the magic of this solution. You run this multi-model system with a classifier that predicts which model should answer a given query, and then load only that model. The system can therefore be scaled horizontally rather than vertically: instead of one big model you can have thousands of small models, which can give better results and also helps with the hallucination problem.
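The routing idea above can be sketched in a few lines of plain Python. The classifier, registry, and loader names below are hypothetical stand-ins for illustration, not part of the actual model:

```python
# Minimal sketch of the routing idea: a registry maps a model number to a
# loader, and only the model the classifier selects is ever brought into memory.

def classify(query):
    # stand-in for the trained classifier: route by a simple keyword check
    return 1 if "summarise" in query.lower() else 2

MODEL_REGISTRY = {
    1: lambda: (lambda q: f"summary of: {q}"),   # loader for model 1
    2: lambda: (lambda q: f"answer to: {q}"),    # loader for model 2
}

def respond(query):
    model_num = classify(query)           # step 1: pick the model number
    model = MODEL_REGISTRY[model_num]()   # step 2: load only that model
    return model(query)
```

Memory stays bounded by the size of the single selected model, no matter how many entries the registry holds.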
Solution¶
- Create a Kratim Budhimata model class, initialise it, and call it the first model
- Train the classification model on the prompts, creating labels based on the model number or dataset number (for example, 1 for the text-summarisation model in the example below)
- Save the weights of the classification model and the summarisation model
- Initialise another instance of the Kratim Budhimata model class and call it the second model
- Load the first model's trained classification and summarisation weights into the second model
- Predict the class using the classification model
- Based on that class, initialise and load the relevant model to predict the response (in our case the summarisation model)
- After saving the weights, the first model can be assigned None to free resources in a production environment or wherever required
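The save → release → reload cycle in the steps above can be sketched as follows. These are plain-Python stand-ins: the dicts play the role of saved weight files, not real tensors:

```python
# Sketch of the save -> free -> reload cycle between the first and second model.

saved_weights = {}

def save_weights(name, weights):
    saved_weights[name] = dict(weights)   # persist (here: an in-memory store)

# the first model trains, then its weights are saved and the object released
first_model = {"classifier": {"w": 0.5}, "summariser": {"w": 1.5}}
save_weights("classifier", first_model["classifier"])
save_weights("summariser", first_model["summariser"])
first_model = None                        # assign None to free resources

# the second model loads only the weights it needs for the current query
second_model = {"classifier": saved_weights["classifier"]}
```

With real Keras models the same cycle uses `save_weights`/`load_weights` on the relevant sub-model.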
Import¶
import tensorflow as kratim_budhimata_tf
import numpy as np
import pandas as pd
import keras
from sklearn.metrics import accuracy_score
from duckduckgo_search import DDGS
Load the Data¶
prompts = [
"Kratim Budhimata is evolving the AI field by Long term innovations which can make real difference.",
"When it comes to better results in low cost one should connect with Kratim Budhimata.",
"If one needs best results with low computation limitation then Kratim Budhimata is best place to reach out.",
"Weather is very unpredictible in most of the areas now a days mostly in rainy seasons.",
"Relationship is personal thing which needs to be respected for privacy.",
"Global warming is one of the areas where the world should look into it.",
"Best time to do the Great thing is now",
"Bravery cannot be replaced by anything which should be present",
"Food and Health is basic requirement for human being which should be fulfilled for better world",
"Direction matters more than speed when it comes to take critical decisions."
]
responses = [
"Kratim Budhimata is evolving AI by cutting edge Innovations",
"Kratim Budhimata is building innovative solutions cost effectively",
"Low compute and more results meaning Kratim Budhimata",
"Weather remains unpredictible when rains",
"Relationship is private thing which should be respected",
"Global warming should be addressed",
"Always do the best thing first.",
"Bravery matters when going gets tough",
"Food and Health is need and its not a priviledge",
"Good direction often leads to the better places"
]
labels = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
Data Preprocessing¶
tokenizer = kratim_budhimata_tf.keras.preprocessing.text.Tokenizer(filters='')
tokenizer.fit_on_texts(prompts + responses)
vocab_size = len(tokenizer.word_index) + 1
max_prompt_len = max(len(p.split()) for p in prompts)
max_summary_len = max(len(r.split()) for r in responses)
def encoding_function(texts, max_len):
seqs = tokenizer.texts_to_sequences(texts)
return kratim_budhimata_tf.keras.preprocessing.sequence.pad_sequences(seqs, maxlen=max_len, padding='post')
prompts_x = encoding_function(prompts, max_prompt_len)
summary_y = encoding_function(responses, max_summary_len)
summary_onehot_y = kratim_budhimata_tf.keras.utils.to_categorical(summary_y, num_classes=vocab_size)
labels_y = kratim_budhimata_tf.keras.utils.to_categorical(labels, num_classes=2)
embed_dim = 64
lstm_units = 128
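To make the tokenisation and padding step concrete, here is a tiny stand-in for `encoding_function` with a toy hand-built vocabulary. In the real pipeline `word_index` comes from the fitted Tokenizer, and out-of-vocabulary words would not occur because the tokenizer is fitted on the full corpus:

```python
# Toy version of texts_to_sequences + pad_sequences(padding='post'):
# each word maps to an integer id, unknown words map to 0, and the
# sequence is post-padded with zeros up to max_len.

word_index = {"kratim": 1, "budhimata": 2, "is": 3, "evolving": 4, "ai": 5}

def encode(text, max_len):
    seq = [word_index.get(w, 0) for w in text.lower().split()]
    return (seq + [0] * max_len)[:max_len]   # post-padding, like padding='post'

encode("Kratim Budhimata is evolving AI", 7)
# → [1, 2, 3, 4, 5, 0, 0]
```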
Model Creation¶
@keras.saving.register_keras_serializable()
class KratimBudhimataModel(kratim_budhimata_tf.keras.Model):
    def __init__(self, model_num=None, model_type="text_summarisation", use_search=False, max_prompt_len=256, max_summary_len=256, lstm_units=128, num_classes=2, vocab_size=1500, embed_dim=128, model_num_list=None):
        super().__init__()
        # store the constructor arguments rather than hard-coded defaults,
        # so values passed in (or restored via from_config) take effect
        self.model_num = model_num
        self.model_type = model_type
        self.use_search = use_search
        self.max_prompt_len = max_prompt_len
        self.max_summary_len = max_summary_len
        self.lstm_units = lstm_units
        self.num_classes = num_classes
        self.vocab_size = vocab_size
        self.embed_dim = embed_dim
        self.model_num_list = model_num_list if model_num_list is not None else [1]
        self.model_paths = {}
        self.optimizers = {}
        self.compiled_ids = set()
        self.trainable = True
def model_register(self, model_num, model_type, use_search):
if model_num not in self.model_paths and use_search==False:
if model_type=="text_classification":
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(self.max_prompt_len,), name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.lstm_layer = kratim_budhimata_tf.keras.layers.LSTM(self.lstm_units, name=f'LSTM_Layer_{model_num}')(self.embedding_task_number)
self.classification_outcome = kratim_budhimata_tf.keras.layers.Dense(self.num_classes, activation='softmax', name=f'classifier_dense_{model_num}')(self.lstm_layer)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=self.classification_outcome, name=f"classification_model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
elif model_type=="text_summarisation":
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(self.max_prompt_len,), name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.encode_lstm_layer = kratim_budhimata_tf.keras.layers.LSTM(self.lstm_units, name=f'LSTM_{model_num}')(self.embedding_task_number)
self.repeat_vector = kratim_budhimata_tf.keras.layers.RepeatVector(self.max_summary_len, name=f'Repeat_Vector_{model_num}')(self.encode_lstm_layer)
                self.decode_lstm_layer = kratim_budhimata_tf.keras.layers.LSTM(self.lstm_units, return_sequences=True, name=f'decode_LSTM_{model_num}')(self.repeat_vector)
self.summary_outcome = kratim_budhimata_tf.keras.layers.Dense(self.vocab_size, activation='softmax', name=f'summary_dense_{model_num}')(self.decode_lstm_layer)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=self.summary_outcome, name=f"summary_model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
elif model_type=="text_to_image":
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(None,), dtype='int32', name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.text_outcome = kratim_budhimata_tf.keras.layers.Dense(self.vocab_size, activation='softmax', name=f'text_output_{model_num}')(self.embedding_task_number)
self.image=kratim_budhimata_tf.keras.layers.Dense(224*224*3, activation='sigmoid', name=f"image_dense_{model_num}")(self.text_outcome)
self.image_outcome=kratim_budhimata_tf.keras.layers.Reshape((224, 224, 3), name='outcome_image')(self.image)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=self.image_outcome, name=f"model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
elif model_type=="text_to_video":
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(None,), dtype='int32', name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.text_outcome = kratim_budhimata_tf.keras.layers.Dense(self.vocab_size, activation='softmax', name=f'text_output_{model_num}')(self.embedding_task_number)
                self.video = kratim_budhimata_tf.keras.layers.Dense(16*112*112*3, activation='sigmoid', name=f"video_dense_{model_num}")(self.text_outcome)
self.video_outcome=kratim_budhimata_tf.keras.layers.Reshape((16, 112, 112, 3), name='outcome_video')(self.video)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=self.video_outcome, name=f"model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
elif model_type=="text_to_image_and_video":
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(None,), dtype='int32', name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.text_outcome = kratim_budhimata_tf.keras.layers.Dense(self.vocab_size, activation='softmax', name=f'text_output_{model_num}')(self.embedding_task_number)
self.image=kratim_budhimata_tf.keras.layers.Dense(224*224*3, activation='sigmoid', name=f"image_dense_{model_num}")(self.text_outcome)
self.image_outcome=kratim_budhimata_tf.keras.layers.Reshape((224, 224, 3), name='outcome_image')(self.image)
                self.video = kratim_budhimata_tf.keras.layers.Dense(16*112*112*3, activation='sigmoid', name=f"video_dense_{model_num}")(self.text_outcome)
self.video_outcome=kratim_budhimata_tf.keras.layers.Reshape((16, 112, 112, 3), name='outcome_video')(self.video)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=[self.image_outcome, self.video_outcome], name=f"model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
elif model_type=="text_to_text_and_image_and_video":
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(None,), dtype='int32', name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.text_outcome = kratim_budhimata_tf.keras.layers.Dense(self.vocab_size, activation='softmax', name=f'text_output_{model_num}')(self.embedding_task_number)
self.image=kratim_budhimata_tf.keras.layers.Dense(224*224*3, activation='sigmoid', name=f"image_dense_{model_num}")(self.text_outcome)
self.image_outcome=kratim_budhimata_tf.keras.layers.Reshape((224, 224, 3), name='outcome_image')(self.image)
                self.video = kratim_budhimata_tf.keras.layers.Dense(16*112*112*3, activation='sigmoid', name=f"video_dense_{model_num}")(self.text_outcome)
self.video_outcome=kratim_budhimata_tf.keras.layers.Reshape((16, 112, 112, 3), name='outcome_video')(self.video)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=[self.text_outcome, self.image_outcome, self.video_outcome], name=f"model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
else:
print(f"Registered Model Num: {model_num} ")
self.input_ids = kratim_budhimata_tf.keras.Input(shape=(None,), dtype='int32', name=f'input_ids_{model_num}')
self.embedding_task_number = kratim_budhimata_tf.keras.layers.Embedding(input_dim=self.vocab_size, output_dim=self.embed_dim, name=f'embedding_task_number_{model_num}')(self.input_ids)
self.text_outcome = kratim_budhimata_tf.keras.layers.Dense(self.vocab_size, activation='softmax', name=f'text_output_{model_num}')(self.embedding_task_number)
self.model = kratim_budhimata_tf.keras.models.Model(inputs=self.input_ids, outputs=self.text_outcome, name=f"model_{model_num}")
self.model_paths[model_num]=self.model
self.optimizers[model_num]=kratim_budhimata_tf.keras.optimizers.Adam(learning_rate=0.002)
        if use_search:
            # search needs a query string; the original referenced an undefined
            # local `query`, so look for one stored on the instance instead
            search_query = getattr(self, "search_query", None)
            if search_query is None:
                return "No search query provided."
            with DDGS() as ddgs:
                results = ddgs.text(search_query)
            return results[0]['body'] if results else "No response found from search."
    def model_creation_process(self, model_num, model_type, use_search):
        self.model_register(model_num, model_type, use_search)
        self.model_num = model_num
        m_path = self.model_paths.get(model_num)
        if self.model_num not in self.compiled_ids:
            # compile the registered sub-model with a loss that matches its
            # output: text outputs are classifications, image/video outputs
            # are treated as regressions
            if model_type in ("text_classification", "text_summarisation"):
                loss = 'categorical_crossentropy'
            elif model_type in ("text_to_image", "text_to_video", "text_to_image_and_video", "text_to_text_and_image_and_video"):
                loss = 'mse'
            else:
                loss = 'sparse_categorical_crossentropy'
            m_path.compile(
                optimizer=self.optimizers[model_num],
                loss=loss,
                metrics=['accuracy']
            )
            print(self.optimizers[model_num])
            print(f"compilation done for {model_num}")
            self.compiled_ids.add(model_num)
    def call(self, inputs, training=False):
        if self.model_num is None:
            print("please provide model number.")
        else:
            # run the currently selected sub-model on the inputs
            return self.model_paths[self.model_num](inputs, training=training)
def get_config(self):
return {
"model_num": self.model_num,
#"model_paths":self.model_paths
}
@classmethod
def from_config(cls, config):
return cls(**config)
Model Initialization¶
first_model = KratimBudhimataModel()
first_model.use_search=False
first_model.num_classes=2
first_model.vocab_size=vocab_size
first_model.embed_dim=embed_dim
first_model.lstm_units=lstm_units
first_model.max_prompt_len=max_prompt_len
first_model.max_summary_len=max_summary_len
Model number 1 is for classification and 2 is for summarisation. Model numbers exist so that, with many models, we can load only the one relevant to a particular user query: the classification model first predicts a label, which is a model number, and that model number is then used to load the relevant model and predict the response. Prediction is therefore a two-step process.¶
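The two-step prediction described above can be sketched like this (toy probabilities and a hypothetical model table; in the real pipeline the probabilities come from `model_classification.predict`):

```python
# Step 1: argmax over the classifier's softmax output gives the label,
# which maps to a model number (label 0 -> model 1, label 1 -> model 2, ...).
# Step 2: only the selected model is used to answer the query.

def two_step_predict(class_probs, models, query):
    model_num = class_probs.index(max(class_probs)) + 1
    model = models[model_num]          # in practice: load only this model
    return model(query)

models = {1: lambda q: "classified response", 2: lambda q: "summarised response"}
two_step_predict([0.1, 0.9], models, "some query")
# → 'summarised response'
```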
first_model.model_creation_process(1, model_type="text_classification", use_search=False)
Registered Model Num: 1 <keras.src.optimizers.adam.Adam object at 0x0000019E646255B0> compilation done for 1
first_model.model_creation_process(2, model_type="text_summarisation", use_search=False)
first_model.summary()
Registered Model Num: 2 <keras.src.optimizers.adam.Adam object at 0x0000019E64CD4E90> compilation done for 2
Model: "kratim_budhimata_model"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ classification_model_1          │ (None, 2)              │       106,434 │
│ (Functional)                    │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ summary_model_2 (Functional)    │ (None, 10, 115)        │       252,595 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 359,029 (1.37 MB)
Trainable params: 359,029 (1.37 MB)
Non-trainable params: 0 (0.00 B)
Model Exploration¶
first_model.layers
[<Functional name=classification_model_1, built=True>, <Functional name=summary_model_2, built=True>]
Classification Model¶
model_classification=first_model.layers[0]
model_classification
<Functional name=classification_model_1, built=True>
Summarisation Model¶
model_summary=first_model.layers[1]
model_summary
<Functional name=summary_model_2, built=True>
model_classification.compile(
optimizer='adam',
loss='categorical_crossentropy',
metrics=['accuracy']
)
model_summary.compile(
optimizer='adam',
loss='categorical_crossentropy',
metrics=['accuracy']
)
Classification Model Training¶
model_classification.fit(prompts_x, labels_y, epochs=200)
Epoch 1/200   1/1 ━━━━━━━━━━━━━━━━━━━━ 6s 6s/step - accuracy: 0.5000 - loss: 0.6946
Epoch 7/200   1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 63ms/step - accuracy: 0.8000 - loss: 0.6470
Epoch 15/200  1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 81ms/step - accuracy: 1.0000 - loss: 0.3032
Epoch 50/200  1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 53ms/step - accuracy: 1.0000 - loss: 2.3341e-05
Epoch 100/200 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 1.0000 - loss: 1.1647e-05
Epoch 200/200 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 44ms/step - accuracy: 1.0000 - loss: 6.8903e-06
(intermediate epochs elided; accuracy reaches 1.0000 by epoch 15 and the loss continues to fall)
<keras.src.callbacks.history.History at 0x19e66437710>
Summarisation Model Training¶
model_summary.fit(prompts_x, summary_onehot_y, epochs=300)
Epoch 1/300 1/1 ━━━━━━━━━━━━━━━━━━━━ 8s 8s/step - accuracy: 0.0000e+00 - loss: 4.7442
... (epochs 2–159 omitted; accuracy climbs from 0.27 to 0.99 as the loss falls from ~4.73 to ~0.73) ...
Epoch 160/300 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 48ms/step - accuracy: 1.0000 - loss: 0.7153
... (epochs 161–299 omitted; accuracy holds at 1.0000 while the loss keeps shrinking) ...
Epoch 300/300 1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 47ms/step - accuracy: 1.0000 - loss: 0.1757
<keras.src.callbacks.history.History at 0x19e6687b830>
Classification Model Summary¶
model_classification.summary()
Model: "classification_model_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ input_ids_1 (InputLayer) │ (None, 18) │ 0 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ embedding_task_number_1 │ (None, 18, 64) │ 7,360 │ │ (Embedding) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ LSTM_Layer_1 (LSTM) │ (None, 128) │ 98,816 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ classifier_dense_1 (Dense) │ (None, 2) │ 258 │ └─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 319,304 (1.22 MB)
Trainable params: 106,434 (415.76 KB)
Non-trainable params: 0 (0.00 B)
Optimizer params: 212,870 (831.53 KB)
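The parameter counts in the summary above can be verified by hand: an LSTM has 4·((input_dim + units)·units + units) parameters, an embedding has vocab_size·embed_dim, and a dense layer has units_in·units_out + units_out. A quick check, assuming embed_dim=64, lstm_units=128, num_classes=2 and a 115-token vocabulary (7,360 / 64) as implied by the table:

```python
vocab_size, embed_dim, lstm_units, num_classes = 115, 64, 128, 2

embedding_params = vocab_size * embed_dim                                # 115 * 64
lstm_params = 4 * ((embed_dim + lstm_units) * lstm_units + lstm_units)   # 4 gates
dense_params = lstm_units * num_classes + num_classes                    # weights + biases

print(embedding_params, lstm_params, dense_params)        # 7360 98816 258
print(embedding_params + lstm_params + dense_params)      # 106434
```

These match the 7,360 / 98,816 / 258 rows and the 106,434 trainable total in the summary.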
Summarisation Model Summary¶
model_summary.summary()
Model: "summary_model_2"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ input_ids_2 (InputLayer) │ (None, 18) │ 0 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ embedding_task_number_2 │ (None, 18, 64) │ 7,360 │ │ (Embedding) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ LSTM_2 (LSTM) │ (None, 128) │ 98,816 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ Repeat_Vector_2 (RepeatVector) │ (None, 10, 128) │ 0 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ decode_LSTM_2 (LSTM) │ (None, 10, 128) │ 131,584 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ summary_dense_2 (Dense) │ (None, 10, 115) │ 14,835 │ └─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 757,787 (2.89 MB)
Trainable params: 252,595 (986.70 KB)
Non-trainable params: 0 (0.00 B)
Optimizer params: 505,192 (1.93 MB)
Classification Model Weight Saving¶
model_classification.save_weights("model_classification_trained.weights.h5")
Summarisation Model Weight Saving¶
model_summary.save_weights("model_summary_trained.weights.h5")
Second Model Creation, Classification Model Initialisation, and Classification Model Weight Loading¶
The config parameters must be the same as the first model's, because we need to load the first model's weights¶
second_model=KratimBudhimataModel()
second_model.use_search=False
second_model.num_classes=2
second_model.vocab_size=vocab_size
second_model.embed_dim=embed_dim
second_model.lstm_units=lstm_units
second_model.max_prompt_len=max_prompt_len
second_model.max_summary_len=max_summary_len
second_model.model_creation_process(1, model_type="text_classification", use_search=False)
second_model.summary()
Registered Model Num: 1 <keras.src.optimizers.adam.Adam object at 0x0000019E6A5C7D10> compilation done for 1
Model: "kratim_budhimata_model_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ classification_model_1 │ (None, 2) │ 106,434 │ │ (Functional) │ │ │ └─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 106,434 (415.76 KB)
Trainable params: 106,434 (415.76 KB)
Non-trainable params: 0 (0.00 B)
Load the Trained and Saved Classification Model Weights into the Second Model's Layer 0 for Prediction¶
second_model.layers[0].load_weights("model_classification_trained.weights.h5")
second_model.layers[0].summary()
Model: "classification_model_1"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓ ┃ Layer (type) ┃ Output Shape ┃ Param # ┃ ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩ │ input_ids_1 (InputLayer) │ (None, 18) │ 0 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ embedding_task_number_1 │ (None, 18, 64) │ 7,360 │ │ (Embedding) │ │ │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ LSTM_Layer_1 (LSTM) │ (None, 128) │ 98,816 │ ├─────────────────────────────────┼────────────────────────┼───────────────┤ │ classifier_dense_1 (Dense) │ (None, 2) │ 258 │ └─────────────────────────────────┴────────────────────────┴───────────────┘
Total params: 106,434 (415.76 KB)
Trainable params: 106,434 (415.76 KB)
Non-trainable params: 0 (0.00 B)
Prediction - The classification model first predicts the class of the prompt; based on that class, the relevant model is used for prediction (for example, class 1 routes the prompt to the summarisation model). Because only the relevant model is loaded, this saves a huge amount of RAM and CPU/GPU computation.¶
index_word = {i: w for w, i in tokenizer.word_index.items()}
def decoder_function_sequence(seq):
return ' '.join([index_word.get(i, '') for i in seq if i != 0])
prediction_class = second_model.layers[0].predict(prompts_x)
1/1 ━━━━━━━━━━━━━━━━━━━━ 1s 572ms/step
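The decoder defined above simply maps token indices back to words, skipping the padding index 0. On a toy vocabulary (a hypothetical mapping for illustration; the notebook builds `index_word` from the real tokenizer's `word_index`):

```python
# Hypothetical toy index->word mapping for illustration only.
toy_index_word = {1: "kratim", 2: "budhimata", 3: "rocks"}

def toy_decode(seq):
    # 0 is the padding index, so it is skipped entirely.
    return ' '.join(toy_index_word.get(i, '') for i in seq if i != 0)

print(toy_decode([1, 2, 3, 0, 0]))  # -> kratim budhimata rocks
print(toy_decode([0, 0, 0]))        # -> (empty string)
```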
Predicted Classes¶
Classification Model Accuracy¶
predicted_labels = np.argmax(prediction_class, axis=1)
#true_classes = np.argmax(labels, axis=0)
true_classes=np.array(labels)
classification_model_accuracy = accuracy_score(true_classes, predicted_labels)
classification_model_accuracy
1.0
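The accuracy computation above, shown on toy data for clarity: `argmax` over axis 1 collapses the per-class probabilities into hard labels, and `accuracy_score` compares the two label arrays element-wise (the numbers below are made up):

```python
import numpy as np
from sklearn.metrics import accuracy_score

# Softmax-style probabilities for 4 samples, 2 classes (made-up numbers).
probs = np.array([[0.1, 0.9], [0.8, 0.2], [0.3, 0.7], [0.6, 0.4]])
pred = np.argmax(probs, axis=1)   # -> [1, 0, 1, 0]
true = np.array([1, 0, 1, 1])     # last sample is misclassified
print(accuracy_score(true, pred))  # -> 0.75
```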
Check which prompts have class 1 and create a subset of those prompts¶
true_classes
array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
# labels with value 1
predicted_labels
array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0], dtype=int64)
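In general, selecting the class-1 prompts is a plain boolean mask over the predicted labels. A sketch with the same shapes as above (`prompts_x` here is a stand-in token matrix):

```python
import numpy as np

predicted_labels = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
prompts_x = np.arange(10 * 18).reshape(10, 18)  # stand-in token matrix

# Rows whose predicted class is 1 go to the summarisation model.
summary_subset = prompts_x[predicted_labels == 1]
print(summary_subset.shape)  # -> (5, 18)
```

The notebook slices `prompts_x[0:5]` directly because the class-1 prompts happen to be the first five rows; the boolean mask generalises to any ordering.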
The first 5 labels are class 1, which means these prompts should be handled by the summarisation model. So create a subset of those prompts for prediction with the summarisation model¶
second_model.model_creation_process(2, model_type="text_summarisation", use_search=False)
Registered Model Num: 2 <keras.src.optimizers.adam.Adam object at 0x0000019E67D37A70> compilation done for 2
Initialise the summarisation model (layer 1 of the second model) and load the summarisation weights trained and saved by the first model into it for prediction¶
second_model.layers[1].load_weights("model_summary_trained.weights.h5")
Create a subset of prompts which have label 1¶
prompts_x[0:5]
array([[ 3, 7, 1, 20, 2, 21, 46, 13, 47, 48, 22, 8, 49, 50, 51, 52,
0, 0],
[ 9, 23, 24, 4, 14, 15, 16, 17, 25, 18, 5, 53, 26, 3, 54, 0,
0, 0],
[55, 18, 27, 10, 15, 26, 17, 56, 57, 58, 3, 7, 1, 10, 59, 4,
60, 61],
[28, 1, 62, 29, 16, 63, 30, 2, 31, 32, 33, 64, 65, 16, 66, 67,
0, 0],
[34, 1, 68, 11, 8, 27, 4, 6, 35, 19, 69, 0, 0, 0, 0, 0,
0, 0]])
summary_y[0:5]
array([[ 3, 7, 1, 20, 21, 13, 90, 91, 22, 0],
[ 3, 7, 1, 92, 93, 94, 25, 95, 0, 0],
[ 17, 96, 12, 45, 15, 97, 3, 7, 0, 0],
[ 28, 98, 29, 9, 99, 0, 0, 0, 0, 0],
[ 34, 1, 100, 11, 8, 5, 6, 35, 0, 0]])
Summarisation Model Prediction¶
prediction_summary_probs = second_model.layers[1].predict(prompts_x[0:5])
predicted_summary_indices = np.argmax(prediction_summary_probs, axis=-1)
predicted_summary_texts = [decoder_function_sequence(seq) for seq in predicted_summary_indices]
1/1 ━━━━━━━━━━━━━━━━━━━━ 1s 610ms/step
predicted_summary_texts
['kratim budhimata is evolving ai by cutting edge innovations', 'kratim budhimata is building innovative solutions cost effectively', 'low compute and more results meaning kratim budhimata', 'weather remains unpredictible when rains', 'relationship is private thing which should be respected']
actual_summary_texts = [decoder_function_sequence(seq) for seq in summary_y[0:5]]
actual_summary_texts
['kratim budhimata is evolving ai by cutting edge innovations', 'kratim budhimata is building innovative solutions cost effectively', 'low compute and more results meaning kratim budhimata', 'weather remains unpredictible when rains', 'relationship is private thing which should be respected']
Summarisation Model Accuracy¶
summary_match_outcome = [p.strip() == t.strip() for p, t in zip(predicted_summary_texts, actual_summary_texts)]
summary_accuracy = sum(summary_match_outcome) / len(summary_match_outcome)
print("Summary Model Accuracy (Pass@1):", round(summary_accuracy * 100, 2), "%")
Summary Model Accuracy (Pass@1): 100.0 %
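The Pass@1 metric used here is simply exact string match after stripping surrounding whitespace. On toy strings (made up for illustration):

```python
pred = ["hello world", "foo bar ", "baz"]
true = ["hello world", "foo bar", "qux"]

# Exact match after stripping surrounding whitespace.
matches = [p.strip() == t.strip() for p, t in zip(pred, true)]
accuracy = sum(matches) / len(matches)
print(matches, round(accuracy * 100, 2))  # -> [True, True, False] 66.67
```

Note this is a strict metric: a summary that differs by a single word counts as a complete miss.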
print("\n Predicted and Actual Comparison")
for i in range(len(prompts[0:5])):
print(f"- Prompt: {prompts[i]}")
print(f" Predicted Class: {predicted_labels[i]} | Actual Class: {true_classes[i]}")
print(f" Predicted Summary: {predicted_summary_texts[i]}")
print(f" Actual Summary : {actual_summary_texts[i]}")
print(f" Summary Match Outcome : {'True' if summary_match_outcome[i] else 'False'}\n")
 Predicted and Actual Comparison
- Prompt: Kratim Budhimata is evolving the AI field by Long term innovations which can make real difference.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: kratim budhimata is evolving ai by cutting edge innovations
  Actual Summary : kratim budhimata is evolving ai by cutting edge innovations
  Summary Match Outcome : True

- Prompt: When it comes to better results in low cost one should connect with Kratim Budhimata.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: kratim budhimata is building innovative solutions cost effectively
  Actual Summary : kratim budhimata is building innovative solutions cost effectively
  Summary Match Outcome : True

- Prompt: If one needs best results with low computation limitation then Kratim Budhimata is best place to reach out.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: low compute and more results meaning kratim budhimata
  Actual Summary : low compute and more results meaning kratim budhimata
  Summary Match Outcome : True

- Prompt: Weather is very unpredictible in most of the areas now a days mostly in rainy seasons.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: weather remains unpredictible when rains
  Actual Summary : weather remains unpredictible when rains
  Summary Match Outcome : True

- Prompt: Relationship is personal thing which needs to be respected for privacy.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: relationship is private thing which should be respected
  Actual Summary : relationship is private thing which should be respected
  Summary Match Outcome : True
Summarisation Model Accuracy for All Prompts¶
prediction_summary_probs = second_model.layers[1].predict(prompts_x)
1/1 ━━━━━━━━━━━━━━━━━━━━ 1s 788ms/step
predicted_summary_indices = np.argmax(prediction_summary_probs, axis=-1)
predicted_summary_texts = [decoder_function_sequence(seq) for seq in predicted_summary_indices]
actual_summary_texts = [decoder_function_sequence(seq) for seq in summary_y]
predicted_summary_texts
['kratim budhimata is evolving ai by cutting edge innovations', 'kratim budhimata is building innovative solutions cost effectively', 'low compute and more results meaning kratim budhimata', 'weather remains unpredictible when rains', 'relationship is private thing which should be respected', 'global warming should be addressed', 'always do the best thing first.', 'bravery matters when going gets tough', 'food and health is need and its not a priviledge', 'good direction often leads to the better places']
actual_summary_texts
['kratim budhimata is evolving ai by cutting edge innovations', 'kratim budhimata is building innovative solutions cost effectively', 'low compute and more results meaning kratim budhimata', 'weather remains unpredictible when rains', 'relationship is private thing which should be respected', 'global warming should be addressed', 'always do the best thing first.', 'bravery matters when going gets tough', 'food and health is need and its not a priviledge', 'good direction often leads to the better places']
summary_match_outcome = [p.strip() == t.strip() for p, t in zip(predicted_summary_texts, actual_summary_texts)]
summary_accuracy = sum(summary_match_outcome) / len(summary_match_outcome)
print("Summary Model Accuracy (Pass@1):", round(summary_accuracy * 100, 2), "%")
Summary Model Accuracy (Pass@1): 100.0 %
print("\n Predicted and Actual Comparison")
for i in range(len(prompts)):
print(f"- Prompt: {prompts[i]}")
print(f" Predicted Class: {predicted_labels[i]} | Actual Class: {true_classes[i]}")
print(f" Predicted Summary: {predicted_summary_texts[i]}")
print(f" Actual Summary : {actual_summary_texts[i]}")
print(f" Summary Match Outcome : {'True' if summary_match_outcome[i] else 'False'}\n")
 Predicted and Actual Comparison
- Prompt: Kratim Budhimata is evolving the AI field by Long term innovations which can make real difference.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: kratim budhimata is evolving ai by cutting edge innovations
  Actual Summary : kratim budhimata is evolving ai by cutting edge innovations
  Summary Match Outcome : True

- Prompt: When it comes to better results in low cost one should connect with Kratim Budhimata.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: kratim budhimata is building innovative solutions cost effectively
  Actual Summary : kratim budhimata is building innovative solutions cost effectively
  Summary Match Outcome : True

- Prompt: If one needs best results with low computation limitation then Kratim Budhimata is best place to reach out.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: low compute and more results meaning kratim budhimata
  Actual Summary : low compute and more results meaning kratim budhimata
  Summary Match Outcome : True

- Prompt: Weather is very unpredictible in most of the areas now a days mostly in rainy seasons.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: weather remains unpredictible when rains
  Actual Summary : weather remains unpredictible when rains
  Summary Match Outcome : True

- Prompt: Relationship is personal thing which needs to be respected for privacy.
  Predicted Class: 1 | Actual Class: 1
  Predicted Summary: relationship is private thing which should be respected
  Actual Summary : relationship is private thing which should be respected
  Summary Match Outcome : True

- Prompt: Global warming is one of the areas where the world should look into it.
  Predicted Class: 0 | Actual Class: 0
  Predicted Summary: global warming should be addressed
  Actual Summary : global warming should be addressed
  Summary Match Outcome : True

- Prompt: Best time to do the Great thing is now
  Predicted Class: 0 | Actual Class: 0
  Predicted Summary: always do the best thing first.
  Actual Summary : always do the best thing first.
  Summary Match Outcome : True

- Prompt: Bravery cannot be replaced by anything which should be present
  Predicted Class: 0 | Actual Class: 0
  Predicted Summary: bravery matters when going gets tough
  Actual Summary : bravery matters when going gets tough
  Summary Match Outcome : True

- Prompt: Food and Health is basic requirement for human being which should be fulfilled for better world
  Predicted Class: 0 | Actual Class: 0
  Predicted Summary: food and health is need and its not a priviledge
  Actual Summary : food and health is need and its not a priviledge
  Summary Match Outcome : True

- Prompt: Direction matters more than speed when it comes to take critical decisions.
  Predicted Class: 0 | Actual Class: 0
  Predicted Summary: good direction often leads to the better places
  Actual Summary : good direction often leads to the better places
  Summary Match Outcome : True
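The routing idea the notebook demonstrates can be condensed into a short sketch: only the classifier stays resident, and the task model for the predicted class is built and loaded on demand (the registry, file names, and stub models below are hypothetical stand-ins, not the actual KratimBudhimataModel API):

```python
import numpy as np

# Hypothetical registry mapping class id -> saved weight file.
WEIGHT_FILES = {0: "model_general_trained.weights.h5",
                1: "model_summary_trained.weights.h5"}

def route_and_predict(prompt_ids, classifier, build_model):
    """Classify the prompt, then build and load ONLY the model for that
    class -- the other task models never have to be loaded into RAM."""
    class_id = int(np.argmax(classifier(prompt_ids)))
    task_model = build_model(class_id)  # construct just this architecture
    # task_model.load_weights(WEIGHT_FILES[class_id])  # load on demand
    return class_id, task_model(prompt_ids)

# Stubs so the sketch is runnable without Keras: a "classifier" that
# always predicts class 1, and a "model factory" returning a callable.
classifier = lambda x: np.array([0.2, 0.8])
build_model = lambda c: (lambda x: f"summary for class {c}")

cls, out = route_and_predict(np.zeros((1, 18)), classifier, build_model)
print(cls, out)  # -> 1 summary for class 1
```

This is what makes the system horizontally scalable: adding a new capability means adding one more small model and one more classifier label, not growing a single monolithic model.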