Phi 1.5 – Introduction and Analysis
In this article, we discuss Phi 1.5, a 1.3 billion parameter language model from Microsoft. ...
In this article, we create an instruction-following Jupyter Notebook interface to prompt a fine-tuned Phi 1.5 model. ...
In this article, we fine-tune the Qwen 1.5 0.5B model on the CodeAlpaca dataset for coding. We use the Hugging Face Transformers SFT Trainer pipeline. ...
In this article, we fine-tune the Phi 1.5 model with QLoRA on the Stanford Alpaca dataset using Hugging Face Transformers. ...
In this article, we use the Hugging Face AutoTrain no-code platform to train the GPT-2 Large model to follow instructions. ...