In this post we will build upon the work covered in the previous posts in this series to add Sentiment Analysis functionality to our library using Azure Cognitive Services.
From the official site, Azure Cognitive Services enables you to ‘Infuse your apps, websites and bots with intelligent algorithms to see, hear, speak, understand and interpret your user needs through natural methods of communication’. Part of their offering, under the ‘Text Analytics API’, is Sentiment Analysis functionality that allows you to extract information from text. You can demo this service online here.
This will be our first look at the Azure Portal, as everything to date has been within the Azure DevOps portal.
Log in to your Azure Portal here. We are going to complete several tasks to get the Text Analytics API set up and ready to use within our library. We will set up a Resource Group for our project that will contain all the services covered in this blog post, then configure the Text Analytics API.
From the dashboard, click ‘Resource Groups’ in the main menu, then ‘Add’. Select your subscription, give your resource group a name (e.g. SentimentAnalysisDemo) and select a region; I’m going to leave it at the default ‘Central US’. Finally, click ‘Review + Create’ to review and ‘Create’ at the bottom to confirm the changes.
It may take several moments for the resource group to be created and become available from your dashboard.
Once the resource group has been created and is listed under the Resource Groups section, click on the newly created group to view its blade, then click either ‘Create resources’ or ‘Add’ from the top menu.
This will launch the Marketplace, allowing you to search all of the available apps and services on Azure. Enter ‘Text Analytics’ into the search box and press enter. The Text Analytics API should now be listed; select it from the list and click ‘Create’ from the blade that appears on the right.
Give your resource a name, select your subscription, location and pricing tier, and click ‘Create’.
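If you prefer scripting over the portal, the same steps can be sketched with the Azure CLI. The resource and group names below are placeholders, and SKU/region availability may differ for your subscription; I'm using westus here to match the endpoint our client will call later.

```shell
# Create the resource group (mirrors the portal steps above)
az group create --name SentimentAnalysisDemo --location centralus

# Create the Text Analytics resource (F0 is the free tier)
az cognitiveservices account create \
  --name TextAnalyticsDemo \
  --resource-group SentimentAnalysisDemo \
  --kind TextAnalytics \
  --sku F0 \
  --location westus \
  --yes

# Retrieve the API keys
az cognitiveservices account keys list \
  --name TextAnalyticsDemo \
  --resource-group SentimentAnalysisDemo
```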
Once the deployment completes we will have our Text Analytics API up and running. Navigate to the blade for the new resource (it will now be listed under the resources section of your resource group). The only thing we need from here is our API keys.
Copy one of these keys to Notepad; we will use it in the next part of this post.
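As a sketch of where this key will eventually live: the consuming application will supply it via configuration under the AzureTextAnalytics key, which is the key our Text Analytics client will look up later in this post. In an ASP.NET Core app that could be appsettings.json (or, better, user secrets):

```json
{
  "AzureTextAnalytics": "<your-api-key>"
}
```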
Now that we have our Text Analytics API up and running on Azure, we want to implement the functionality in our library.
Open VS Code on your machine and create three new folders under the TwitterSentiment project for TextAnalyticsClient, TwitterClient and Extensions.
Copy Tweet.cs and TwitterClient.cs into the TwitterClient folder. Now create the following classes under the TextAnalyticsClient folder.
The first three files are just models for the Text Analytics API, plus a DTO class that combines our Tweets with their Sentiment Analysis results.
AnalysisResult.cs
using System.Collections.Generic;

namespace TwitterSentiment
{
    public class AnalysisResult
    {
        public List<DocumentAnalysis> Documents { get; set; }
        public List<Error> Errors { get; set; }
    }

    public class DocumentAnalysis
    {
        public string Id { get; set; }
        public double Score { get; set; }
    }

    public class Error
    {
        public string Id { get; set; }
        public string Message { get; set; }
    }
}
Document.cs
using System.Collections.Generic;

namespace TwitterSentiment
{
    public class DocumentWrapper
    {
        public List<Document> Documents { get; set; }
    }

    public class Document
    {
        public string Id { get; set; }
        public string Language { get; set; }
        public string Text { get; set; }
    }
}
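For context, these models mirror the JSON that the v2.0 sentiment endpoint exchanges. A representative request body (the shape DocumentWrapper models):

```json
{ "documents": [ { "id": "1", "language": "en", "text": "This is a great day" } ] }
```

and a matching response (the shape AnalysisResult models), where the score runs from 0 (negative) to 1 (positive); the 0.96 here is purely illustrative:

```json
{ "documents": [ { "id": "1", "score": 0.96 } ], "errors": [] }
```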
TweetsWithSentiment.cs
namespace TwitterSentiment
{
    public class TweetsWithSentiment
    {
        public string Id { get; set; }
        public string Text { get; set; }
        public double Score { get; set; }

        public TweetsWithSentiment(string id, string text)
        {
            Id = id;
            Text = text;
        }

        public TweetsWithSentiment(string id, string text, double score)
        {
            Id = id;
            Text = text;
            Score = score;
        }
    }
}
Extensions.cs
using System.Collections.Generic;
using System.Linq;

namespace TwitterSentiment
{
    public static class Extensions
    {
        public static List<Document> ProjectToDocuments(this List<Tweet> tweets)
        {
            var docs = new List<Document>();
            tweets.ForEach(t => docs.Add(new Document { Id = t.Id, Text = t.Text, Language = "en" }));
            return docs;
        }

        public static IEnumerable<TweetsWithSentiment> Combine(this List<Tweet> tweets, AnalysisResult sentiment)
        {
            return tweets.Join(sentiment.Documents,
                t => t.Id,
                s => s.Id,
                (t, s) => new TweetsWithSentiment(t.Id, t.Text, s.Score));
        }
    }
}
The Extensions class above provides a couple of extension methods to project a list of Tweet objects into a generic list of Document objects for use with the Text Analytics API and to combine the results of the Sentiment Analysis with the source Tweets.
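As a quick sketch of how these two extension methods compose (the tweet text and scores below are made up purely for illustration, and we assume Tweet exposes settable Id and Text properties, as ProjectToDocuments reads them):

```csharp
using System.Collections.Generic;
using System.Linq;
using TwitterSentiment;

// Hypothetical sample data.
var tweets = new List<Tweet>
{
    new Tweet { Id = "1", Text = "I love this" },
    new Tweet { Id = "2", Text = "I hate this" }
};

// Project the tweets into the Document shape the API expects.
var docs = tweets.ProjectToDocuments();

// Pretend the API returned these scores, then join them back on Id.
var analysis = new AnalysisResult
{
    Documents = new List<DocumentAnalysis>
    {
        new DocumentAnalysis { Id = "1", Score = 0.9 },
        new DocumentAnalysis { Id = "2", Score = 0.1 }
    }
};

var combined = tweets.Combine(analysis).ToList();
// combined[0] pairs tweet "1" with its 0.9 score.
```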
TextAnalyticsClient.cs
using Microsoft.Extensions.Configuration;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

namespace TwitterSentiment
{
    public class TextAnalyticsClient
    {
        private readonly IConfiguration _config;

        public HttpClient _client { get; }

        public TextAnalyticsClient(IConfiguration config, HttpClient client)
        {
            _client = client;
            _config = config;
            _client.BaseAddress = new Uri("https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/");
            _client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", _config["AzureTextAnalytics"]);
        }

        public async Task<AnalysisResult> AnalyzeSentiment(List<Document> documents)
        {
            using (var content = new StringContent(JsonConvert.SerializeObject(new DocumentWrapper { Documents = documents })))
            {
                content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
                var response = await _client.PostAsync("sentiment", content);
                response.EnsureSuccessStatusCode();
                return await response.Content.ReadAsAsync<AnalysisResult>();
            }
        }
    }
}
The TextAnalyticsClient class above is another typed client, similar to the TwitterClient.cs we created in part 2. It makes a POST request to the Text Analytics API with a list of Documents (containing the Tweets) as the payload and deserializes the analysis results from the response.
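If you want to sanity-check your key and endpoint before wiring everything up, the equivalent raw request can be sketched with curl; replace <your-key> with one of the keys copied earlier. The URL and header match the BaseAddress and subscription-key header used in the client above.

```shell
curl -X POST "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment" \
  -H "Ocp-Apim-Subscription-Key: <your-key>" \
  -H "Content-Type: application/json" \
  -d '{"documents":[{"id":"1","language":"en","text":"This is a great day"}]}'
```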
One final class will simplify how we register our library in our ASP.NET Core web application. The extension will allow us to register our library's services in the ASP.NET Core Startup using the following code:
services.AddTwitterSentiment();
Under the Extensions folder, create the following class:
ConfigureSentimentAnalysis.cs
using Microsoft.Extensions.DependencyInjection;
using TwitterSentiment;
using TwitterSentiment.OAuth;

namespace Microsoft.Extensions.DependencyInjection
{
    public static class ConfigurationExtensions
    {
        public static IServiceCollection AddTwitterSentiment(this IServiceCollection services)
        {
            services.AddTransient<OAuthMessageHandler>();

            services.AddHttpClient<TwitterClient>()
                .AddHttpMessageHandler<OAuthMessageHandler>();

            services.AddHttpClient<TextAnalyticsClient>();

            return services;
        }
    }
}
Note in the above code we have registered our TwitterClient and TextAnalyticsClient using the generic AddHttpClient method, which is part of the new HttpClientFactory feature we discussed in part 2 of this blog post series.
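To illustrate where this registration lands in the consuming application, a hypothetical Startup.cs might look like the following; we will build the real web application in the next post, so this is just a sketch.

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // One call wires up TwitterClient, TextAnalyticsClient and the
        // OAuthMessageHandler via HttpClientFactory.
        services.AddTwitterSentiment();
    }
}
```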
Once all of the new files have been added, your project structure should be as follows:
Open your command prompt in the root of your application and run the following to build your latest changes:
cd TwitterSentiment
dotnet build
If everything has worked up to this point your build should succeed.
Before we push our latest changes to GitHub, we will add one additional unit test covering the AnalyzeSentiment method on the TextAnalyticsClient.
Create a new TextAnalyticsClientShould.cs file next to the TwitterClientShould.cs class in the Test project with the following code:
using Microsoft.Extensions.Configuration;
using Newtonsoft.Json;
using NSubstitute;
using Shouldly;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Tests.Fakes;
using TwitterSentiment;
using Xunit;

namespace Tests
{
    public class TextAnalyticsClientShould
    {
        [Fact]
        public async Task Get_Sentiment_Analysis_Results_Using_TypedHttpClient()
        {
            // Arrange: three documents and the canned analysis results the fake handler will return.
            var document1 = new Document { Id = "1", Text = "This is a really negative tweet", Language = "en-gb" };
            var document2 = new Document { Id = "2", Text = "This is a super positive great tweet", Language = "en-gb" };
            var document3 = new Document { Id = "3", Text = "This is another really super positive amazing tweet", Language = "en-gb" };

            var result1 = new DocumentAnalysis { Id = "1", Score = 0 };
            var result2 = new DocumentAnalysis { Id = "2", Score = 0.7 };
            var result3 = new DocumentAnalysis { Id = "3", Score = 0.9 };

            var documents = new List<Document> { document1, document2, document3 };
            var results = new AnalysisResult { Documents = new List<DocumentAnalysis> { result1, result2, result3 } };

            var fakeConfiguration = Substitute.For<IConfiguration>();
            var fakeHttpMessageHandler = new FakeHttpMessageHandler(new HttpResponseMessage()
            {
                StatusCode = HttpStatusCode.OK,
                Content = new StringContent(JsonConvert.SerializeObject(results), Encoding.UTF8, "application/json")
            });
            var fakeHttpClient = new HttpClient(fakeHttpMessageHandler);

            var sut = new TextAnalyticsClient(fakeConfiguration, fakeHttpClient);

            // Act
            var result = await sut.AnalyzeSentiment(documents);

            // Assert
            result.Documents.Count.ShouldBe(3);
            result.Documents.ShouldContain(f => f.Id == result1.Id && f.Score == result1.Score);
        }
    }
}
The above test is pretty similar to the TwitterClient test we wrote earlier; however, here we use the NSubstitute library to mock the IConfiguration parameter that gets injected into the TextAnalyticsClient. As you can see from the syntax below, it's very easy to mock an object by calling Substitute.For&lt;T&gt;:
var fakeConfiguration = Substitute.For<IConfiguration>();
Before our test project will compile we need to bring in one additional package containing the IConfiguration interface we have just mocked and injected into the TextAnalyticsClient.
From the command line in the root of your project, navigate to the TwitterSentiment.Tests project and add the Microsoft.Extensions.Configuration package (again, we will pin to version 2.1.1):
cd TwitterSentiment.Tests
dotnet add package Microsoft.Extensions.Configuration --version 2.1.1
Once installed, remain on your command line within the TwitterSentiment.Tests library, then build and run the tests to ensure everything looks good:
dotnet build
dotnet test
Your output should show that 2 tests have now passed.
That's our library completed and ready for use in our web application. We just need to push our latest changes to GitHub to trigger a new CI/CD build and create our updated NuGet package.
From your command prompt, navigate to the root of your project and run the following:
git add .
git commit -m "Add Sentiment Analysis Functionality"
git push
This should push our changes and trigger a new CI/CD build. Once the build completes you should have an updated package listed under your NuGet feed.
We are now ready to build our ASP.NET Core web application using Blazor in the next blog post!