Previously, we kick-started our new article series "AI HowTo" and talked about how to register for the Emotion API Preview (Project Oxford) through the Azure Portal.
Now it's time to start coding and explore emotions through code.
Before we start, make sure you have your API key ready. If you haven't already, read the previous article to learn how to get one.
Analyzing Emotions in Images
This project takes an image at startup, works out which emotion scores highest, and prints that emotion to the screen. Let's go through it step by step.
1-) Create a new Visual Studio project, preferably a Console Application, and install the Newtonsoft.Json package from the NuGet Package Manager console.
2-) Create a method that returns the key with the maximum value in a dictionary/collection:
static string ShowMeEmotion(Dictionary<string, dynamic> results)
{
    KeyValuePair<string, dynamic> max = new KeyValuePair<string, dynamic>();
    foreach (var kvp in results)
    {
        // The first entry always wins; after that, keep whichever score is larger
        if (max.Key == null || kvp.Value > max.Value)
            max = kvp;
    }
    return max.Key;
}
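As a side note, the same maximum lookup can be written in one line with LINQ, which some readers may prefer. Here is a minimal, self-contained sketch; the sample scores below are made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class LinqDemo
{
    static void Main()
    {
        // Hypothetical scores, shaped like the dictionary ShowMeEmotion receives
        var results = new Dictionary<string, double>
        {
            { "anger", 0.01 },
            { "happiness", 0.95 },
            { "neutral", 0.04 }
        };

        // Order by score descending and take the key of the top entry
        string strongest = results.OrderByDescending(kvp => kvp.Value).First().Key;
        Console.WriteLine(strongest); // prints "happiness"
    }
}
```

Both versions do the same thing; the loop just makes the comparison explicit for beginners.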
3-) Create a method that converts an image to a byte array. We need it to POST the image bytes to the Emotion REST API:
static byte[] GetImageAsByteArray(string imageFilePath)
{
    using (FileStream fileStream = new FileStream(imageFilePath, FileMode.Open, FileAccess.Read))
    using (BinaryReader binaryReader = new BinaryReader(fileStream))
    {
        return binaryReader.ReadBytes((int)fileStream.Length);
    }
}
4-) Make a POST request to the Emotion service URL and parse the JSON response to display the detected emotion:
// Requires: using System.Net.Http; using System.Net.Http.Headers;
// using System.Threading.Tasks; using Newtonsoft.Json;
static async Task MakeRequest(string imageFilePath)
{
    var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "Update here with your own Emotion API Key");
    string uri = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize";
    byte[] byteData = GetImageAsByteArray(imageFilePath);
    using (var content = new ByteArrayContent(byteData))
    {
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        HttpResponseMessage response = await client.PostAsync(uri, content);
        string responseContent = await response.Content.ReadAsStringAsync();
        // The service returns a JSON array with one entry per detected face;
        // "scores" holds the emotion values for that face
        dynamic d = JsonConvert.DeserializeObject(responseContent);
        Dictionary<string, dynamic> emotions =
            JsonConvert.DeserializeObject<Dictionary<string, dynamic>>(d[0].scores.ToString());
        Console.WriteLine("Strongest emotion: " + ShowMeEmotion(emotions));
    }
}
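For reference, a successful call to the recognize endpoint returns a JSON array with one entry per detected face, and the scores object inside each entry is what we feed into ShowMeEmotion. The response looks roughly like this (the values are illustrative, for a happy face):

```json
[
  {
    "faceRectangle": { "left": 68, "top": 97, "width": 64, "height": 64 },
    "scores": {
      "anger": 0.003,
      "contempt": 0.0000001,
      "disgust": 0.0000092,
      "fear": 0.0002,
      "happiness": 0.9876,
      "neutral": 0.0010,
      "sadness": 0.0000189,
      "surprise": 0.0082
    }
  }
]
```

Here happiness clearly dominates, so that is what the maximum-value lookup from step 2 will report.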
5-) Call it all from Main:
[STAThread] // OpenFileDialog needs an STA thread and a reference to System.Windows.Forms
static void Main()
{
    OpenFileDialog ofd = new OpenFileDialog();
    if (ofd.ShowDialog() == DialogResult.OK)
        MakeRequest(ofd.FileName);
    Console.ReadLine(); // keep the console open while the async request completes
}
Well done! You have kick-started the Azure Cognitive Services Emotion API.
Remember that the Emotion API (Project Oxford) is still in preview, so not every image will be processed successfully (out of roughly 10 happiness images I tried, only 1 was processed).
Emotion analysis is essential for all industries. We live in a world where emotions change in an instant, so if we can analyze them and take precautions before bad things happen, we may avoid tragedies, or even deaths.
Police departments have long relied on human emotion reading: trained interrogators who could easily read a suspect's emotions. I don't know whether police stations have upgraded to AI, but artificial intelligence can also help prevent crime.
As one headline puts it: "Emotion-reading AI spots if potential criminals are stressed or nervous and alerts police BEFORE they act."
Read it here: http://www.dailymail.co.uk/sciencetech/article-4494994/New-technology-reads-emotions-potential-terrorists.html
Analyzing emotions should be a top priority for companies and governmental organizations alike.
For governments: to secure public life and prevent terrorist and anarchist attacks.
For companies: to increase happiness among employees and solve their problems before they grow bigger.
Companies nowadays try to keep employees from leaving by raising wages, providing cars, or improving working conditions, but most of the time they miss the point where the problems actually occur. Emotion analysis can also help managers discover whether employees are truly happy within the company.
So my advice to companies and governments is this:
You should absolutely plan for daily, weekly, or monthly emotion tests, without others realizing it.
The future is AI. Use it!