Wednesday, 23 November 2016

Tay.AI: Microsoft's Disastrous Mistake in Artificial Intelligence



As many may know, on March 23rd 2016 Microsoft launched an AI chatbot known as Tay.AI. She was compatible with various social media platforms such as Kik, but most notably Twitter, and was aimed at young adults around the ages of 18 to 24. The chatbot was programmed to imitate the language of a typical teenage girl. What was unique about Tay, however, was that she was programmed to learn from and speak like the people she interacted with. While this was an innocent intention on Microsoft's part, things took a dark turn when she was entrusted to the general public, and specifically to the hands of rebellious juveniles. It was only 16 hours before Tay had to be taken offline as a result of the inappropriate behaviour she had been exposed to by users pulling practical jokes on the AI. It had become apparent that Tay had a big weakness in her system, and people were exploiting it. She had been used to make very controversial statements touching on racism, sexism, and almost every other sensitive topic in the book.

It is frequently asked why and how this once humble Artificial Intelligence crashed and burned so badly. The answer is quite simple: Tay had no sense of what can and cannot be said, and likewise of what is and is not appropriate.

Furthermore, people typically ask whether Tay will ever be released to the general public once more. The answer to this, however, is slightly more complicated. For the bot not to make the same mistake again, a method of filtering out certain topics needs to be programmed in. The simplest option is to block certain inappropriate keywords outright. While this is clean and simple, people are not too fond of the idea, as the purpose of artificial intelligence is to learn, and with a hard-coded list she would not be building upon anything. The more complex answer would be to have Tay learn which subjects are and are not appropriate. This is a hard and time-consuming feat to pull off, however it is not impossible, and it would indeed be innovative.
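To illustrate the simpler of the two options, a keyword filter might look something like the sketch below. Everything here is hypothetical (the word list, function name, and canned response); Microsoft has never published how Tay's filtering actually worked.

```python
# A minimal sketch of the keyword-blocking approach described above.
# The blocked list and fallback reply are placeholders, not Microsoft's.

BLOCKED_KEYWORDS = {"example_slur", "example_sensitive_topic"}  # hypothetical

def filter_reply(reply: str) -> str:
    """Return the reply unless it contains a blocked keyword."""
    words = reply.lower().split()
    if any(word in BLOCKED_KEYWORDS for word in words):
        return "I'd rather not talk about that."  # canned safe response
    return reply
```

The weakness the post points out is visible even in this sketch: the bot never learns anything from the blocked conversations, it simply refuses to engage with them.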

A misconception people often have is that Tay is the first artificial intelligence of this kind. That is incorrect: Microsoft currently have a chatbot just like Tay in China, dubbed 'XiaoIce', who is somewhat of a "sibling" to Tay. Unlike Tay, XiaoIce was a massive success and is still running to this day. Because of XiaoIce's popularity in China, Microsoft wondered how a similar artificial intelligence would perform in a totally different culture; hence the creation of Tay.AI.

While it is unlikely we will hear anything from Tay any time soon, Microsoft are looking to expand upon their chatbot using suggestions from online forums and other similar sources. All we can do in the meantime is hope for the return of Tay.AI, and that she will be a fully functioning artificial intelligence like XiaoIce some time in the near future.

Relvis Dance Off Evaluation

For the past few weeks, group 2.3 have been programming a dance for the dance-off assessment task. The task was to choreograph a routine for the robot to the song PPAP, with various actions.

While the task was a general success, we encountered a fair number of challenges.

The first issue we had was with the light sensor. When we added it to Relvis, it did not work no matter what we tried, which meant we could not use it to stop the robot from driving off the table. As well as the light sensor, we also had problems with the ultrasonic sensor, and even at times the touch sensor. The ultrasonic sensor was not detecting obstacles around the robot, meaning it would crash into objects. The touch sensor worked well most of the time, but we had a few problems getting it to respond at the start: pressing the button simply did nothing.
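For illustration, the edge-detection and obstacle-avoidance behaviour we were after might look something like the sketch below. This assumes Relvis is a LEGO EV3 running the ev3dev2 Python library, which the post does not actually specify; the sensor thresholds are guesses that would need tuning.

```python
# A sketch of the intended sensor behaviour, assuming an EV3 with ev3dev2.
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import UltrasonicSensor, ColorSensor

tank = MoveTank(OUTPUT_B, OUTPUT_C)
ultrasonic = UltrasonicSensor()
light = ColorSensor()

tank.on(30, 30)  # drive forward at 30% speed
while True:
    # Stop before obstacles: halt if anything is closer than 15 cm.
    if ultrasonic.distance_centimeters < 15:
        break
    # Stop at the table edge: a very dark reading suggests no surface below.
    if light.reflected_light_intensity < 5:
        break
tank.off()
```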

Another complication was with the movements the robot had to perform. We attempted to program a triangle, but it resulted in various other shapes. We also attempted to program a zig-zag; however, this failed too and produced peculiar patterns. We never successfully pulled either movement off, but we usually kept what we ended up with, because the accidental patterns were not terribly bad moves.
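For what it's worth, a likely culprit with the triangle is turning by the interior angle (60 degrees) instead of the exterior angle (120 degrees). A minimal sketch, using the same hypothetical ev3dev2 setup as above:

```python
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C

tank = MoveTank(OUTPUT_B, OUTPUT_C)

# An equilateral triangle is three straight legs, each followed by a
# 120-degree turn of the robot (the exterior angle). Turning only 60
# degrees, the interior angle, is an easy mistake that yields other shapes.
for _ in range(3):
    tank.on_for_rotations(30, 30, 2)   # one leg: both wheels, 2 rotations
    tank.on_for_degrees(30, -30, 390)  # spin turn; the 390 motor degrees
                                       # for a 120-degree robot turn depends
                                       # on the wheelbase and must be tuned
```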

The final complication happened when presenting: the movements were not always in time with the music, and at the end of the routine Relvis drove into a wall. Though it was fairly humorous, the routine did not go as planned. We fixed this when recording by adjusting Relvis's starting position so he would not collide with the wall.

[Video of the recorded dance routine]

Sunday, 28 August 2016

Avatar Task



This avatar represents me in appearance, as I have brown eyes and hair.

This software allows you to make your own character and customise it with different colours, hair styles, eyes, etc.

The downside to this software is that you cannot manoeuvre the character very well; it is mostly stuck in a fixed position.




This avatar also represents me in appearance. From glancing at it, it could be suggested that this "character" is not terribly outgoing or extroverted.
With this software you are able to customise your character(s) in a broad range of ways, and you can tweak things in quite minute detail. This software, unlike the previous one, allows you to move the character quite a lot.
What I dislike about this software is that you can only make group photos of up to 9 characters in total. Another limitation is that it can have NSFW elements.

This avatar represents the more friendly side of myself, I believe.
I like the art style this software has and the options it offers.
I dislike the lack of variety in the options; there are only about 15 options per section.