94 changes: 94 additions & 0 deletions md/04.HOL/dotnet/csharplabs.md
## Welcome to the Phi-3 labs using C#

These labs showcase how to integrate different versions of the powerful Phi-3, Phi-3.5, and Phi-4 models in a .NET environment.

## Prerequisites
Before running the sample, ensure you have the following installed:

**.NET 9:** Make sure you have the [latest version of .NET](https://dotnet.microsoft.com/download/dotnet/) installed on your machine.

**(Optional) Visual Studio or Visual Studio Code:** You will need an IDE or code editor capable of running .NET projects. [Visual Studio](https://visualstudio.microsoft.com/) or [Visual Studio Code](https://code.visualstudio.com/) are recommended.

**Git:** Use git to clone one of the available Phi-3, Phi-3.5, or Phi-4 models locally from [Hugging Face](https://huggingface.co).

**Download the Phi-4 ONNX models** to your local machine:

### navigate to the folder to store the models
```bash
cd c:\phi\models
```
### add support for lfs
```bash
git lfs install
```
### clone and download the Phi-4 mini instruct model and the Phi-4 multimodal model

```bash
git clone https://huggingface.co/microsoft/Phi-4-mini-instruct-onnx

git clone https://huggingface.co/microsoft/Phi-4-multimodal-instruct-onnx
```

**Download the Phi-3-mini-4k-instruct-onnx model** to your local machine:

### navigate to the folder to store the models
```bash
cd c:\phi3\models
```
### add support for lfs
```bash
git lfs install
```
### clone and download mini 4K instruct model
```bash
git clone https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx
```

### clone and download vision 128K model
```bash
git clone https://huggingface.co/microsoft/Phi-3-vision-128k-instruct-onnx-cpu
```
**Important:** The current demos are designed to use the ONNX versions of the models, which the previous steps clone.

## About the Labs

The main solution has several sample labs that demonstrate the capabilities of the Phi-3 and Phi-4 models using C#.

| Project | Description | Location |
| ------------ | ----------- | -------- |
| LabsPhi301 | This is a sample project that uses a local Phi-3 model to ask a question. The project loads a local ONNX Phi-3 model using the `Microsoft.ML.OnnxRuntime` libraries. | .\src\LabsPhi301\ |
| LabsPhi302 | This is a sample project that implements a console chat using Semantic Kernel. | .\src\LabsPhi302\ |
| LabsPhi303 | This is a sample project that uses a local Phi-3 Vision model to analyze images. The project loads a local ONNX Phi-3 Vision model using the `Microsoft.ML.OnnxRuntime` libraries. | .\src\LabsPhi303\ |
| LabsPhi304 | This is a sample project that uses a local Phi-3 Vision model to analyze images. The project loads a local ONNX Phi-3 Vision model using the `Microsoft.ML.OnnxRuntime` libraries. The project also presents a menu with different options to interact with the user. | .\src\LabsPhi304\ |
| LabsPhi4-Chat-01OnnxRuntime | This is a sample project that uses a local Phi-4 model for a console chat. The project loads a local ONNX Phi-4 model using the `Microsoft.ML.OnnxRuntime` libraries. | .\src\LabsPhi4-Chat-01OnnxRuntime\ |
| LabsPhi4-Chat-02SK | This is a sample project that uses a local Phi-4 model for a console chat. The project loads a local ONNX Phi-4 model using the `Semantic Kernel` libraries. | .\src\LabsPhi4-Chat-02SK\ |


## How to Run the Projects

To run the projects, follow these steps:
1. Clone the repository to your local machine.

1. Open a terminal and navigate to the desired project. For example, let's run `LabsPhi4-Chat-01OnnxRuntime`.
```bash
cd .\src\LabsPhi4-Chat-01OnnxRuntime\
```

1. Run the project with the command
```bash
dotnet run
```

1. The sample project asks for user input and replies using the local model.

The running demo is similar to this one:

```bash
PS D:\phi\PhiCookBook\md\04.HOL\dotnet\src\LabsPhi4-Chat-01OnnxRuntime> dotnet run
Ask your question. Type an empty string to Exit.

Q: 2+2
Phi4: The sum of 2 and 2 is 4.

Q:
```
67 changes: 67 additions & 0 deletions md/04.HOL/dotnet/src/LabsPhi.sln

Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.10.34928.147
MinimumVisualStudioVersion = 10.0.40219.1
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "LabsPhi301", "LabsPhi301\LabsPhi301.csproj", "{22131B1B-1289-41DF-882F-A2E16A63BE9E}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "LabsPhi302", "LabsPhi302\LabsPhi302.csproj", "{1373D0EA-81B1-43BE-A8CA-0DD7A162FC3F}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "LabsPhi303", "LabsPhi303\LabsPhi303.csproj", "{1254AB34-B99A-4E4C-BD95-18BB22BF478E}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "LabsPhi304", "LabsPhi304\LabsPhi304.csproj", "{4EB2FBF6-75D3-4B10-B5B5-675469C24780}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Phi3", "Phi3", "{02EA681E-C7D8-13C7-8484-4AC65E1B71E8}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Phi4", "Phi4", "{9DF367B9-E0EA-4ABB-A144-5E13A9508A69}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "LabsPhi4-Chat-01OnnxRuntime", "LabsPhi4-Chat-01OnnxRuntime\LabsPhi4-Chat-01OnnxRuntime.csproj", "{C6986362-DDF7-6EF2-75EE-B042B4F4B4D2}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "LabsPhi4-Chat-02SK", "LabsPhi4-Chat-02SK\LabsPhi4-Chat-02SK.csproj", "{66045465-B4A8-D929-3D15-926FD376FE2E}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
Release|Any CPU = Release|Any CPU
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{22131B1B-1289-41DF-882F-A2E16A63BE9E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{22131B1B-1289-41DF-882F-A2E16A63BE9E}.Debug|Any CPU.Build.0 = Debug|Any CPU
{22131B1B-1289-41DF-882F-A2E16A63BE9E}.Release|Any CPU.ActiveCfg = Release|Any CPU
{22131B1B-1289-41DF-882F-A2E16A63BE9E}.Release|Any CPU.Build.0 = Release|Any CPU
{1373D0EA-81B1-43BE-A8CA-0DD7A162FC3F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{1373D0EA-81B1-43BE-A8CA-0DD7A162FC3F}.Debug|Any CPU.Build.0 = Debug|Any CPU
{1373D0EA-81B1-43BE-A8CA-0DD7A162FC3F}.Release|Any CPU.ActiveCfg = Release|Any CPU
{1373D0EA-81B1-43BE-A8CA-0DD7A162FC3F}.Release|Any CPU.Build.0 = Release|Any CPU
{1254AB34-B99A-4E4C-BD95-18BB22BF478E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{1254AB34-B99A-4E4C-BD95-18BB22BF478E}.Debug|Any CPU.Build.0 = Debug|Any CPU
{1254AB34-B99A-4E4C-BD95-18BB22BF478E}.Release|Any CPU.ActiveCfg = Release|Any CPU
{1254AB34-B99A-4E4C-BD95-18BB22BF478E}.Release|Any CPU.Build.0 = Release|Any CPU
{4EB2FBF6-75D3-4B10-B5B5-675469C24780}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{4EB2FBF6-75D3-4B10-B5B5-675469C24780}.Debug|Any CPU.Build.0 = Debug|Any CPU
{4EB2FBF6-75D3-4B10-B5B5-675469C24780}.Release|Any CPU.ActiveCfg = Release|Any CPU
{4EB2FBF6-75D3-4B10-B5B5-675469C24780}.Release|Any CPU.Build.0 = Release|Any CPU
{C6986362-DDF7-6EF2-75EE-B042B4F4B4D2}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{C6986362-DDF7-6EF2-75EE-B042B4F4B4D2}.Debug|Any CPU.Build.0 = Debug|Any CPU
{C6986362-DDF7-6EF2-75EE-B042B4F4B4D2}.Release|Any CPU.ActiveCfg = Release|Any CPU
{C6986362-DDF7-6EF2-75EE-B042B4F4B4D2}.Release|Any CPU.Build.0 = Release|Any CPU
{66045465-B4A8-D929-3D15-926FD376FE2E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{66045465-B4A8-D929-3D15-926FD376FE2E}.Debug|Any CPU.Build.0 = Debug|Any CPU
{66045465-B4A8-D929-3D15-926FD376FE2E}.Release|Any CPU.ActiveCfg = Release|Any CPU
{66045465-B4A8-D929-3D15-926FD376FE2E}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(NestedProjects) = preSolution
{22131B1B-1289-41DF-882F-A2E16A63BE9E} = {02EA681E-C7D8-13C7-8484-4AC65E1B71E8}
{1373D0EA-81B1-43BE-A8CA-0DD7A162FC3F} = {02EA681E-C7D8-13C7-8484-4AC65E1B71E8}
{1254AB34-B99A-4E4C-BD95-18BB22BF478E} = {02EA681E-C7D8-13C7-8484-4AC65E1B71E8}
{4EB2FBF6-75D3-4B10-B5B5-675469C24780} = {02EA681E-C7D8-13C7-8484-4AC65E1B71E8}
{C6986362-DDF7-6EF2-75EE-B042B4F4B4D2} = {9DF367B9-E0EA-4ABB-A144-5E13A9508A69}
{66045465-B4A8-D929-3D15-926FD376FE2E} = {9DF367B9-E0EA-4ABB-A144-5E13A9508A69}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {EAA25EC1-C5F2-40DC-8080-4DAF13311AEE}
EndGlobalSection
EndGlobal
16 changes: 16 additions & 0 deletions md/04.HOL/dotnet/src/LabsPhi301/LabsPhi301.csproj
<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>

<ItemGroup>
<PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.18.0" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI" Version="0.3.0-rc2" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI.Cuda" Version="0.3.0-rc2" />
</ItemGroup>

</Project>
71 changes: 71 additions & 0 deletions md/04.HOL/dotnet/src/LabsPhi301/Program.cs
// Copyright (c) 2024
// Author : Bruno Capuano
// Change Log :
//
// The MIT License (MIT)
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.

using Microsoft.ML.OnnxRuntimeGenAI;


var modelPath = @"D:\phi3\models\Phi-3-mini-4k-instruct-onnx\cpu_and_mobile\cpu-int4-rtn-block-32";
var model = new Model(modelPath);
var tokenizer = new Tokenizer(model);

var systemPrompt = "You are an AI assistant that helps people find information. Answer questions using a direct style. Do not share more information than requested by the users.";

// chat start
Console.WriteLine(@"Ask your question. Type an empty string to Exit.");


// chat loop
while (true)
{
// Get user question
Console.WriteLine();
Console.Write(@"Q: ");
var userQ = Console.ReadLine();
if (string.IsNullOrEmpty(userQ))
{
break;
}

// show phi3 response
Console.Write("Phi3: ");
var fullPrompt = $"<|system|>{systemPrompt}<|end|><|user|>{userQ}<|end|><|assistant|>";
var tokens = tokenizer.Encode(fullPrompt);

var generatorParams = new GeneratorParams(model);
generatorParams.SetSearchOption("max_length", 2048);
generatorParams.SetSearchOption("past_present_share_buffer", false);
generatorParams.SetInputSequences(tokens);

using var generator = new Generator(model, generatorParams);
while (!generator.IsDone())
{
generator.ComputeLogits();
generator.GenerateNextToken();
var outputTokens = generator.GetSequence(0);
var newToken = outputTokens.Slice(outputTokens.Length - 1, 1);
var output = tokenizer.Decode(newToken);
Console.Write(output);
}
Console.WriteLine();
}
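The generation loop above decodes only the newest token each iteration (`outputTokens.Slice(outputTokens.Length - 1, 1)`), which is what makes the answer stream to the console as it is produced. The same slicing idea, shown with a stand-in decoder (the dictionary vocabulary is invented for illustration and is not the Phi-3 tokenizer):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in vocabulary; a real tokenizer maps token ids to subword strings.
var vocab = new Dictionary<int, string> { [1] = "The", [2] = " sum", [3] = " is", [4] = " 4." };
string Decode(IEnumerable<int> ids) => string.Concat(ids.Select(id => vocab[id]));

var sequence = new List<int>();
var streamed = "";
foreach (var nextId in new[] { 1, 2, 3, 4 }) // pretend these ids arrive one per generation step
{
    sequence.Add(nextId);
    // Decode only the newest token, mirroring outputTokens.Slice(outputTokens.Length - 1, 1).
    var piece = Decode(sequence.Skip(sequence.Count - 1));
    Console.Write(piece);
    streamed += piece;
}
Console.WriteLine();
```

Decoding only the tail avoids re-printing the whole sequence on every step while still producing the same text as decoding the full sequence once at the end.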
22 changes: 22 additions & 0 deletions md/04.HOL/dotnet/src/LabsPhi302/LabsPhi302.csproj
<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>

<ItemGroup>
<PackageReference Include="feiyun0112.SemanticKernel.Connectors.OnnxRuntimeGenAI.CPU" Version="1.0.0" />
<PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.18.0" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI" Version="0.3.0-rc2" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI.Cuda" Version="0.3.0-rc2" />
<PackageReference Include="Microsoft.Extensions.Configuration.UserSecrets" Version="9.0.0-preview.4.24266.19" />
<PackageReference Include="Microsoft.Extensions.Logging" Version="9.0.0-preview.4.24266.19" />
<PackageReference Include="Microsoft.Extensions.Logging.Console" Version="9.0.0-preview.4.24266.19" />
<PackageReference Include="Microsoft.SemanticKernel" Version="1.13.0" />
<PackageReference Include="Microsoft.SemanticKernel.Connectors.Onnx" Version="1.13.0-alpha" />
</ItemGroup>

</Project>
60 changes: 60 additions & 0 deletions md/04.HOL/dotnet/src/LabsPhi302/Program.cs
// Copyright (c) 2024
// Author : Bruno Capuano
// Change Log :
//
// The MIT License (MIT)
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

var modelPath = @"D:\phi3\models\Phi-3-mini-4k-instruct-onnx\cpu_and_mobile\cpu-int4-rtn-block-32";

// create kernel
var builder = Kernel.CreateBuilder();
builder.AddOnnxRuntimeGenAIChatCompletion(modelPath: modelPath);
var kernel = builder.Build();

// create chat
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();

// run chat
while (true)
{
Console.Write("Q: ");
var userQ = Console.ReadLine();
if (string.IsNullOrEmpty(userQ))
{
break;
}
history.AddUserMessage(userQ);

Console.Write($"Phi3: ");
var response = "";
var result = chat.GetStreamingChatMessageContentsAsync(history);
await foreach (var message in result)
{
Console.Write(message.Content);
response += message.Content;
}
history.AddAssistantMessage(response);
Console.WriteLine("");
}
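The Semantic Kernel loop above streams the reply chunk by chunk while accumulating it into a single string, so one complete assistant turn lands in the history. A self-contained sketch of that accumulation pattern (the `FakeStream` iterator and its chunks are invented stand-ins for `chat.GetStreamingChatMessageContentsAsync`):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Stand-in for the streaming chat service; yields the reply in chunks.
async IAsyncEnumerable<string> FakeStream()
{
    foreach (var chunk in new[] { "The sum of 2 and 2 ", "is 4." })
    {
        await Task.Yield();
        yield return chunk;
    }
}

var history = new List<(string Role, string Content)>();
history.Add(("user", "2+2"));

var response = "";
await foreach (var chunk in FakeStream())
{
    Console.Write(chunk); // print as the chunks arrive
    response += chunk;    // accumulate the full reply
}
history.Add(("assistant", response)); // one complete turn goes into the history
Console.WriteLine();
```

Appending the accumulated string, rather than each chunk, keeps the history as whole messages, which is what the model expects as context on the next turn.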
34 changes: 34 additions & 0 deletions md/04.HOL/dotnet/src/LabsPhi303/LabsPhi303.csproj
<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>

<ItemGroup>
<PackageReference Include="Microsoft.ML.OnnxRuntime" Version="1.18.0" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI" Version="0.3.0-rc2" />
<PackageReference Include="Microsoft.ML.OnnxRuntimeGenAI.Cuda" Version="0.3.0-rc2" />
</ItemGroup>

<ItemGroup>
<None Update="imgs\foggyday.jpg">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
<None Update="imgs\foggyday.png">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
<None Update="imgs\petsmusic.jpg">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
<None Update="imgs\petsmusic.png">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
<None Update="imgs\ultrarunningmug.png">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</None>
</ItemGroup>

</Project>