Key Features
Dataset Processing: The system loads a text dataset, tokenizes sentences into states, and constructs a transition matrix where each state predicts the next based on observed frequency.
Markov Model Construction: Words are mapped to hashed indices, and transitions are stored in a weighted matrix. The system normalizes probabilities to ensure valid state transitions.
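A minimal Python sketch of this flow, assuming an order-1 chain and plain dictionaries standing in for the hashed-index matrix; `build_transitions` and the sample sentences are illustrative, not the project's actual API.

```python
from collections import defaultdict

def build_transitions(sentences):
    """Count word-to-word transitions, then normalize each row into probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in sentences:
        tokens = sentence.lower().split()
        for current, nxt in zip(tokens, tokens[1:]):
            counts[current][nxt] += 1
    # Normalize so every state's outgoing probabilities sum to 1.
    transitions = {}
    for state, nexts in counts.items():
        total = sum(nexts.values())
        transitions[state] = {word: n / total for word, n in nexts.items()}
    return transitions

transitions = build_transitions(["the cat sat", "the cat ran", "a dog sat"])
print(transitions["cat"])  # e.g. {'sat': 0.5, 'ran': 0.5}
```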
Command-Based Interactions:
Build Markov: Builds the transition matrices for a chosen Markov order (e.g. 1-4), for both forward and backward generation.
Forward Generation: Generates a sentence in a forward direction.
Backward Generation: Generates a sentence in a backward direction.
Save Session: Saves each generated sentence on a topic into a session (see the persistence sketch after this list).
Load Session: Loads a saved conversation.
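One possible shape for session persistence, sketched with JSON serialization; the file layout, topic field, and function names here are assumptions for illustration, not the project's actual format.

```python
import json

def save_session(path, topic, sentences):
    """Write a topic and its generated sentences to a JSON session file."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump({"topic": topic, "sentences": sentences}, f, indent=2)

def load_session(path):
    """Read a previously saved session back into a dictionary."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

save_session("session.json", "cats", ["the cat sat", "the cat ran"])
print(load_session("session.json")["sentences"])
```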
Text Generation: Supports forward and backward generation. Includes state memory for contextual responses and dynamic sentence variation. Generated results are printed directly to the dev console.
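A rough sketch of how forward and backward walks over the normalized tables could work, reusing the dict-of-dicts layout from the earlier sketch; the tiny literal tables and function names are illustrative, not the project's implementation. A backward table is simply built from right-to-left transitions, so the walk produces the sentence in reverse and flips it at the end.

```python
import random

def step(transitions, state):
    """Pick the next word from a state's normalized probability row, or None at a dead end."""
    nexts = transitions.get(state)
    if not nexts:
        return None
    words = list(nexts.keys())
    weights = list(nexts.values())
    return random.choices(words, weights=weights)[0]

def generate(transitions, seed, max_len=12, reverse=False):
    """Random-walk from a seed word; reverse=True flips the result for backward tables."""
    words = [seed]
    while len(words) < max_len:
        nxt = step(transitions, words[-1])
        if nxt is None:
            break
        words.append(nxt)
    return " ".join(reversed(words) if reverse else words)

# Illustrative tables: forward maps word -> next word, backward maps word -> previous word.
forward_table = {"the": {"cat": 1.0}, "cat": {"sat": 0.5, "ran": 0.5}}
backward_table = {"sat": {"cat": 1.0}, "cat": {"the": 1.0}}

print(generate(forward_table, "the"))                  # e.g. "the cat sat"
print(generate(backward_table, "sat", reverse=True))   # "the cat sat"
```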