Support Vector Machine Classification Using a C# Wrapper over LibSVM

Support Vector Machine (SVM) classification is a complex technique that makes binary predictions (the variable to predict can take one of just two possible values, such as “male” or “female”).
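At its core, a trained SVM classifies a new item by computing the sign of a weighted sum plus a bias. A minimal sketch of the linear case (the weights and bias below are made-up values for illustration, not ones produced by libsvm):

```csharp
using System;

static class LinearSvm
{
  // Classify x as +1 or -1 using sign(w . x + b).
  // In a real trained SVM, w and b come from the training process.
  public static int Classify(double[] w, double b, double[] x)
  {
    double sum = b;
    for (int i = 0; i < w.Length; ++i)
      sum += w[i] * x[i];
    return sum >= 0.0 ? +1 : -1;
  }
}
```

The full (kernel) SVM replaces the dot product with a kernel function evaluated against the support vectors, but the sign-of-a-sum idea is the same.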

A few years ago I implemented an SVM from scratch and it took me several weeks. Over the past few years, one implementation called libsvm has become the de facto standard and is used by many machine learning libraries, such as scikit-learn.

The libsvm library is written in C++ but has been ported to many other programming languages. I wondered if anyone had tried to port libsvm to C#. I found such a port at https://github.com/ccerhan/LibSVMsharp by a guy named Can Erhan so I decided to give it a try.

I launched Visual Studio 2015 and created a new C# console application. I right-clicked on the project name, selected the Manage NuGet Packages option, did a search for “libsvmsharp”, and then located and installed the package.

I created a dummy problem, using the rather annoying libsvm data format:

-1 1:4 2:5 3:7
-1 1:7 2:4 3:2
-1 1:0 2:6 3:12
-1 1:1 2:4 3:8
 1 1:9 2:7 3:5
 1 1:14 2:7 3:0
 1 1:6 2:9 3:12
 1 1:8 2:9 3:10

Each token after the leading class label has the form index:value, where indices are 1-based. So the first line means the item with inputs (4, 5, 7) has class “-1”, and the dummy data has eight training items.
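The sparse index:value format is easy to decode by hand. Here is a hypothetical helper, not part of LibSVMsharp, that parses one data line into a class label and a dense feature vector, assuming 1-based indices as in the data above:

```csharp
using System;
using System.Globalization;

static class LibSvmFormat
{
  // Parse one libsvm-format line, e.g. "-1 1:4 2:5 3:7", into a
  // class label and a dense feature vector of length numFeatures.
  public static (double label, double[] features) ParseLine(
    string line, int numFeatures)
  {
    string[] tokens = line.Trim().Split(new char[] { ' ' },
      StringSplitOptions.RemoveEmptyEntries);
    double label = double.Parse(tokens[0], CultureInfo.InvariantCulture);
    double[] features = new double[numFeatures];
    for (int i = 1; i < tokens.Length; ++i)
    {
      string[] pair = tokens[i].Split(':');
      int idx = int.Parse(pair[0]);  // 1-based feature index
      features[idx - 1] = double.Parse(pair[1],
        CultureInfo.InvariantCulture);
    }
    return (label, features);
  }
}
```

In the demo program itself, the SVMProblemHelper.Load() method handles this parsing.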

After I wrote a demo program, my first attempt at running it failed until I right-clicked on the libsvm.dll component in the Solution Explorer window and set its “Copy to Output Directory” property to “Copy always”. Additionally, the libsvm.dll core component is a 32-bit binary, so I had to change the platform target in my Project Properties from “Any CPU” to “x86”.

SVM systems are extremely complicated. When they work well, they work very, very well. But they’re difficult to use because you have to select a kernel function (typically radial basis function or polynomial) and then experiment with many parameters, such as Complexity (C), Gamma, Coef0, and Degree.
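The two kernels mentioned above have simple mathematical definitions: the radial basis function kernel is K(x, z) = exp(-gamma * ||x - z||^2) and the polynomial kernel is K(x, z) = (gamma * x·z + coef0)^degree. A sketch of both (these follow the standard libsvm definitions, but the code itself is illustrative, not taken from LibSVMsharp):

```csharp
using System;

static class Kernels
{
  // Radial basis function kernel: exp(-gamma * squared distance).
  public static double Rbf(double[] x, double[] z, double gamma)
  {
    double dist2 = 0.0;
    for (int i = 0; i < x.Length; ++i)
      dist2 += (x[i] - z[i]) * (x[i] - z[i]);
    return Math.Exp(-gamma * dist2);
  }

  // Polynomial kernel: (gamma * dot(x, z) + coef0) ^ degree.
  public static double Poly(double[] x, double[] z,
    double gamma, double coef0, int degree)
  {
    double dot = 0.0;
    for (int i = 0; i < x.Length; ++i)
      dot += x[i] * z[i];
    return Math.Pow(gamma * dot + coef0, degree);
  }
}
```

Notice that the RBF kernel of a vector with itself is always 1.0, and that Gamma, Coef0, and Degree correspond directly to the SVMParameter properties set in the demo program below.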

SVMs were popular in the late 1990s before the ascendancy of neural techniques. I had the good fortune to work with some of the researchers who pioneered SVMs. Engineers like me seek simplicity, but researchers seek complexity: the more complex an idea is, the more interesting it is from a research perspective, and the more opportunities there are to write research papers that incrementally add a small, new idea. The basic ideas of SVMs are simple and elegant, but the details of SVMs are heinously complex, and that makes them incredibly fascinating.



I think my love of math and computers started early on. I loved playing cards and the idea that the simplicity of 4 suits plus 13 ranks could lead to beautiful and amazing complexity.


using System;
using System.Collections.Generic;
using LibSVMsharp;
using LibSVMsharp.Helpers;
using LibSVMsharp.Extensions;

namespace SVM_CSharp3
{
  class Program
  {
    static void Main(string[] args)
    {
      try
      {
        Console.WriteLine("\nBegin SVM libsvmsharp demo \n");
        SVMProblem trainData =
          SVMProblemHelper.Load("..\\..\\dummy.txt");

        // Set the parameter set
        SVMParameter parameter = new SVMParameter();
        parameter.Type = SVMType.C_SVC;
        parameter.Kernel = SVMKernelType.POLY;
        parameter.C = 1.0; parameter.Gamma = 1.0;
        parameter.Degree = 2; parameter.Coef0 = 0.0;

        Console.WriteLine("Starting training");
        SVMModel model = trainData.Train(parameter);
        Console.WriteLine("Training complete");

        // Save the model
        SVM.SaveModel(model, "..\\..\\dummy_model.txt");

        // Make predictions
        double[] predictions = trainData.Predict(model);

        // Evaluate the model
        int[,] CM;
        double acc =
          trainData.EvaluateClassificationProblem(predictions,
          model.Labels, out CM);
        Console.WriteLine("\nModel accuracy on training data: "
          + acc.ToString("F2") + "%");
        Console.WriteLine("\nConfusion matrix:\n");
        Console.Write(String.Format("{0,6}", ""));
        for (int i = 0; i < model.Labels.Length; i++)
          Console.Write(String.Format("{0,5}", "(" +
            model.Labels[i] + ")"));
        Console.WriteLine();
        for (int i = 0; i < CM.GetLength(0); i++)
        {
          Console.Write(String.Format("{0,5}", "(" +
            model.Labels[i] + ")"));
          for (int j = 0; j < CM.GetLength(1); j++)
            Console.Write(String.Format("{0,5}", CM[i,j]));
          Console.WriteLine();
        }

        // libsvm support vector indices are 1-based, so subtract 1
        int[] sv_indices = model.SVIndices;
        Console.WriteLine("\nSV 0-based indices: ");
        for (int i = 0; i < sv_indices.Length; ++i)
        {
          Console.WriteLine(sv_indices[i] - 1 + " ");
        }

        Console.WriteLine("\nSupport vectors: ");
        List<SVMNode[]> list = model.SV;
        for (int i = 0; i < list.Count; ++i)
        {
          var zz = list[i];
          for (int j = 0; j < zz.Length; ++j)
          {
            Console.Write(zz[j].Value + " ");
          }
          Console.WriteLine("");
        }

        Console.WriteLine("\nModel coefficients:");
        List<double[]> coefs = model.SVCoefs;
        for (int i = 0; i < coefs.Count; ++i)
        {
          var c = coefs[i];
          for (int j = 0; j < c.Length; ++j)
          {
            Console.Write(c[j].ToString("F6") + " ");
          }
          Console.WriteLine("");
        }

        Console.WriteLine("\nModel constant:");
        double[] b = model.Rho;
        Console.WriteLine(b[0].ToString("F6"));
      } // try
      catch (Exception ex)
      {
        Console.WriteLine(ex.Message);
        Console.ReadLine();
      }
      Console.WriteLine("\nEnd demo ");
      Console.ReadLine();
    } // main
  } // program
} // ns