Development

After completing the sketches and working out the interactions, I went into Adobe XD to mark out the layout I wanted for the interface, using minimal images and design elements so I could re-organise or make adjustments when needed.

After I had settled on the format I started inserting the images I would use and fleshing out some of the information text and descriptions.

 

I then went on to develop the look, testing how I wanted the “message boxes” and the surrounding layout to appear, moving it away from being a wire-frame into a designed and stylised interface.

 

After finalising the aesthetic I could apply the style to the other slides in the wire-frame, inserting more context to make it feel like a real, tangible interface for someone to work through. I could then work on the smaller aesthetic details and interactions; for example, I trialled different placements for a “further details” button and how I wanted it to look. Ultimately I changed it to a “glossary” button to serve as a quick look-up for those using the interface.

Web 1920 – DETAILS HOME
Web 1920 – OPEN GLOSSARY

Shown below is one of the process variants of how the user navigates to the screen shown above.

Full Adobe XD “layout” of all the slides.

Start of XD Prototype.

If I could develop this further I would move it out of XD onto its own website, developing it further with animations and sound.

Breathing life into the Design

After consolidating the wire-frames and overall structure I could move on to designing the look and ‘feel’ of the ‘Voight Kampff Test’. I wasn’t sure where to start when shifting from the wire-frames, as I’m not too confident in my design taste. However, after looking at current interactions, current science fiction interfaces and older science fiction interfaces, I had a rough visual image of what my system would look like.

Older science fiction interfaces were quite vibrant but very bold, and looked very much like a ‘console’; however, as mentioned before, they were designed within the design paradigms and technical limitations of their time.

‘Current’ UI design and interfaces are generally minimal and clean.

A main source of inspiration was the game NieR:Automata, in which you play as an android battling machines. Its UI designer described the approach:

The world of NieR:Automata leans heavily towards science fiction, but the previous title in the series (NIER Replicant/Gestalt) had more of a fantasy aesthetic. When I put together NieR:Automata’s UI concept, I tried to make its sci-fi elements seem like a natural extension of the first title’s fantasy.

I had several visual notions to work with: The previous game’s UI; conventional sci-fi aesthetics; 2B herself; and the general idea of a luxurious and decadent world. Mixing all of those aesthetics together, I arrived at my key concept for NieR:Automata’s UI: a design that was systematic and sterile, but also beautiful. At that time, I felt strongly that the best way to convey this would be to avoid ornate decoration and focus on giving it a clean, graceful and flat design. But I soon realized that if I stuck too close to that idea, the final design would end up flavorless.

Contextualising the ‘Test’

I began rapidly sketching out a variety of designs; however, it was not until further discussion that it developed into a concept I think worked well: a ‘Voight-Kampff Test’ of sorts, where people go through a simulated ‘test’ to identify ‘Deepfakes’ through trial and error. Upon further discussion, we noted that although the wireframe and user-flow were well defined, I now had to develop it further, making it look less like a wireframe and more like a coherent design. What was still missing, however, was the context in which this ‘Voight Kampff Test Program’ was situated.

Since the source material and inspiration are fictional, and the concept is speculative, the ‘client’ itself could also be a fictional, speculative organisation. For this reason, I pitched an organisation set in a future where Deepfakes are widespread, founded to counter malicious and harmful uses of the technology such as misinformation, impersonation and fabricated identities. The test is designed for the company’s potential employees, applicants for its lowest-paying job: the manual labour of going through and identifying these fakes. The testing program trains applicants to spot the fakes, and on completion they can be processed through. This harks back to the futuristic but bleak setting of Blade Runner, where technology has advanced but, for the average person, working conditions remain poor at best.

‘Welcome to XYZ Corp, potential candidate for our ‘Identification Initiative’, where you’ll be on the forefront of countering impersonation and false identities. As a member of our large team, 5000 members strong (and counting), we’ll be relying on your keen skills in identification to fight back the threat. Complete our testing program to see if you have what it takes to join our large family at XYZ Corp.’

Design Wire-frames

I sketched out a rough wire-frame before mocking it up in Adobe Experience Design (XD). The format I had envisioned could either be physical combined with digital, or a purely digital experience, creating a ‘Blade Runner-esque’ detective scenario.

Physical Version

The physical version could be a set of printed-out Deepfakes mixed in with photos of real people, presented like ‘Polaroids’. These could then be inserted into a machine which ‘reads and analyses’ the images and gives the operator details on which images are fake and which are real.

On the technical side, I discussed with my tutor how the physical version could work. A section of the table could be cut out and replaced with a transparent acrylic surface, closed off except for a slit for the ‘Polaroids’ to be inserted through. A camera would be placed underneath, and the back of each ‘Polaroid’ would carry a QR or bar-code corresponding to its ‘digital’ counterpart, which would get ‘scanned’. This creates the illusion that the cards have been ‘digitally scanned’ and analysed, blending the physical act into the digital interaction.
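As a rough illustration of how little is needed to fake this ‘scan’, below is a minimal sketch of the under-table reader, assuming a webcam aimed at the slit and OpenCV’s built-in QR detector; the card IDs, the POLAROIDS lookup table and the printed ‘analysis’ are hypothetical placeholders, not part of the actual prototype.

```python
# Hypothetical sketch of the hidden under-table scanner: a webcam watches the
# slit, decodes the QR code on the back of an inserted 'Polaroid', and looks
# up that card's pre-written 'analysis' to drive the on-screen interface.
import cv2

# Hypothetical mapping from QR payloads to each card's digital counterpart.
POLAROIDS = {
    "card-01": {"verdict": "REAL", "notes": "No generation artefacts found."},
    "card-02": {"verdict": "FAKE", "notes": "Hood merges into the background."},
}

def watch_for_cards(camera_index: int = 0) -> None:
    """Poll the camera and trigger the 'analysis' whenever a card is read."""
    capture = cv2.VideoCapture(camera_index)
    detector = cv2.QRCodeDetector()
    last_seen = None
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                continue
            payload, _, _ = detector.detectAndDecode(frame)
            if payload and payload != last_seen and payload in POLAROIDS:
                card = POLAROIDS[payload]
                # In the installation this would advance the digital interface.
                print(f"Scanned {payload}: {card['verdict']} – {card['notes']}")
                last_seen = payload
    finally:
        capture.release()

if __name__ == "__main__":
    watch_for_cards()
```

Because the ‘analysis’ is keyed entirely off the code on the back of the card, the machine never has to inspect the photo itself, which is what sells the illusion of a real scanner.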

Wire-frame Sketches

I sketched out the dual process showing how both the physical and digital versions would work and where the functionality of each combined, mapping out how the digital and physical interactions fit together.

 

XD Wire-framing

The first iteration of the sketch involved just a rough wire-frame of how the interaction would be displayed and what I wanted each page to involve. However, when creating this I noticed that I accidentally informed the user that the images were deepfakes, when I actually wanted to leave them in the dark until the images were analysed; only then would they be told which were deepfakes and which were real people, and the task would begin.

 

 

Design Development

After discussing the project and the concepts I had in mind, I finally found the idea clear and well defined. However, I was unable to progress beyond it being a well-defined “concept” and wasn’t sure how to realise it as a tangible thing that exists in reality.

I went back over how the various projects and discussions I had with people led me to finalise this concept, in particular my interest in fiction and the science-fiction film genre.

I looked back at Blade Runner, the focus of the project, exploring the role of the ‘Voight-Kampff’ machine and how it was used in the universe of the film. Within the film it is operated by a ‘Blade Runner’ to help identify whether or not a person is real. Looking closely at the machine, it focuses on a specific part of the body (the eye, in the case of the film) so the operator can concentrate on it during the process, while the machine can also detect things the operator cannot.

While it is ultimately a tool to aid identification, it is up to the operator, the ‘Blade Runner’, to decide and make an intuitive judgement based on their knowledge. I wanted to make this a foundation of the concept: this ‘Voight-Kampff’ of today would be a “mock” tool that helps people intuitively identify deepfakes through a learning process likened to a game.

I looked back over the small task I created for the ‘In-Progress Exhibition’, examining what happened and what I could have done to improve it. The setup involved 4 examples with faces of real people and computer-generated deepfakes; the goal was to see if people could identify the differences using a second sheet with some hints for spotting them. However, I found that people weren’t using the sheet as intended, partially, I think, because the way I had arranged it and the information I wrote made it seem like further description rather than help. Whilst the help sheet didn’t work as intended, I noticed that people were starting to pick up clues as I gave them hints and examples and guided them through.

Concept

This first concept played around with what I had presented at the exhibition: someone would be presented with a range of deepfakes and real people, as in the task, and simply asked to pick 4 without knowing what would happen next.

These picks would then be “analysed”, detailing whether each was a ‘fake’ (a Deepfake) or real, explaining how and why the fakes are fakes, and giving the user some idea of how to tell them apart.

The user would then go back to the previous page, this time with further advice, and have to select 4 ‘fakes’. Using the knowledge they had acquired, and, as mentioned before, ‘intuitively identifying them’ all correctly, they would learn how to differentiate between the images.

Once they had selected 4 correctly, they would be presented with further information on deepfakes and the inspiration behind the project, in particular highlighting that this was only one specific area of ‘Deepfake Technology’.
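To pin down the flow, here is a minimal console sketch of that select-analyse-retry loop; the Portrait records, clues and prompts are illustrative assumptions, not content from the actual prototype.

```python
# A console sketch of the first concept's loop: pick 4 images, see the
# 'analysis', then retry with the acquired knowledge until all 4 picks
# are fakes. All data and prompts here are illustrative placeholders.
import random
from dataclasses import dataclass

@dataclass
class Portrait:
    name: str
    is_fake: bool
    clue: str  # the tell revealed during 'analysis'

POOL = [
    Portrait("A", True,  "hood merges into the background"),
    Portrait("B", False, "consistent lighting and hair detail"),
    Portrait("C", True,  "background warps around the jawline"),
    Portrait("D", False, "natural skin texture"),
    Portrait("E", True,  "asymmetric earrings, smeared teeth"),
    Portrait("F", False, "coherent reflections in the eyes"),
    Portrait("G", True,  "blurred patch in the hair on the right"),
    Portrait("H", False, "no generation artefacts found"),
]

def run_round(pool: list[Portrait], picks: int = 4) -> bool:
    """Show the pool, take the user's picks, 'analyse' them, report success."""
    for i, p in enumerate(pool):
        print(f"[{i}] portrait {p.name}")
    raw = input(f"Select {picks} suspected fakes (e.g. 0 2 4 6): ").split()
    chosen = [pool[int(i)] for i in raw[:picks]]
    for p in chosen:
        verdict = "FAKE" if p.is_fake else "REAL"
        print(f"  portrait {p.name}: {verdict} – {p.clue}")
    return all(p.is_fake for p in chosen)  # won only if every pick was a fake

random.shuffle(POOL)
while not run_round(POOL):
    print("Not all of your picks were fakes – use the clues and try again.\n")
print("All 4 correct. Here is some further reading on Deepfake technology...")
```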

The second concept is extremely similar to the first; however, instead of selecting from a wide range, you’d select from 5 fake and real images of varying difficulty.

This would then lead to a similar process of going back and identifying which images are fake, but over 4-5 different tasks, guessing between 1-2 fakes in each.

Work-In-Progress Exhibition

@hugFMP work-in-progress exhibition

We held a work-in-progress exhibition in the studio where students and staff from IID and affiliated courses such as IDA and MA UI/UX were invited to see our progress. It had to be a prototype, experiment or manifestation of our concept, rather than a presentation of our research.

For the exhibition we had to explain our concept in a small paragraph; the paragraph I finalised for the exhibit was this:

We are increasingly presented with software that imitates human behaviour or appearance. AI generated imagery and conversational agents are more difficult to identify as non-human than ever. In the dystopian fiction Blade Runner human ‘replicants’ are subjected to a ‘Voight-Kampff’ machine, which helps an operator identify the synthetic humans. What would a ‘Voight-Kampff’ machine of today be and how could it be used?

Work-In-Progress Exhibition Setup

The information sheet tries to give some further background on what deepfakes are and how you can identify them, and to explain how, in 5 or maybe 10 years’ time, these could become difficult to differentiate from an image of a real human being.

Example key so I know which images are fake or real, marked “F” (Fake) or “R” (Real)

Findings:

I found that most people couldn’t identify the deepfakes when presented with a mixed set of photos of real people and deepfakes.

Only when presented with hints, or when some of the issues and flaws in these ‘Deepfakes’ were pointed out, were people able to identify and differentiate between them.

For example, in the image second from the top left, one noticeable generation artefact is the hood merging into the background. The same can be seen in the bottom left, and people could use this as a clue to start picking out other deepfakes, making the task easier. In contrast, the bottom right image looks like a real person; the one noticeable issue is a blur in the hair on the right, although that could be put down to the print, as I had originally printed the images onto paper for the demonstration.

I had some suggestions about perhaps ‘gamifying’ the experience so people could learn how to recognise and identify deepfakes.

 

Further Research into Machine Learning and DeepFake Technology

Further links and research into the area around AI and Deepfakes:

 

AI Object recognition using training data to single out individual items.

 

 

NVIDIA Research

DeepTrace

Can AI Detect Deepfakes To Help Ensure Integrity of U.S. 2020 Elections?

DeepVoiceFakes

This AI lets you deepfake your voice to speak like Barack Obama

AIFakeTexts

New AI fake text generator may be too dangerous to release, say creators

https://aibusiness.com/recaptcha-trains-google-robots/

What would the ‘Voight-Kampff’ Machine of Today be like?

Ridley Scott’s Blade Runner is set in an alternative, dystopian 2019. What would reality’s version of its technology actually look like in the real year 2019?

The ‘Voight-Kampff’ machine in the movie’s narrative assists in identifying ‘Replicants’, the synthetic androids of its world. In the words of Syd Mead, the machine was made to look like it was “breathing”, because it would ‘inhale’ localised air between the interviewer and interviewee, process it, and pick up acidic traces and so forth to try to detect the Replicant, mimicking animals, which can ‘smell if you’re afraid’.

This was paired with the questions to form the ‘Voight-Kampff Test’. It measured bodily functions such as respiration, heart rate, blushing and eye movement in response to emotionally provocative questions.

Blade Runner Wiki

Description from the original 1982 Blade Runner presskit:

A very advanced form of lie detector that measures contractions of the iris muscle and the presence of invisible airborne particles emitted from the body. The bellows were designed for the latter function and give the machine the menacing air of a sinister insect. The VK is used primarily by Blade Runners to determine if a suspect is truly human by measuring the degree of his empathic response through carefully worded questions and statements.

The Voight-Kampff machine is perhaps analogous to (and may have been partly inspired by) Alan Turing’s work which propounded an artificial intelligence test — to see if a computer could convince a human (by answering set questions, etc.) that it was another human. The phrase Turing test was popularised by science fiction but was not used until years after Turing’s death.

‘Determining if something is truly human’: what would today’s tool be used to determine and identify, and what would the ‘subject’ (in place of the synthetic being/AI) be?

What is today’s ‘subject’?

Deepfake Technology

The things we usually have to distinguish as human or not range from primitive physical disguises to digital fakes, the latter more commonly known as ‘Deepfakes’. Using ‘artificial intelligence’, a person can combine and superimpose existing images and videos onto source material using a machine learning technique called a ‘generative adversarial network’, or GAN for short. This has resulted in videos that depict a person or persons saying or doing things that never actually occurred.

An example is the Deepfake of Barack Obama that Jordan Peele and Jonah Peretti created in April 2018 as a public service announcement about the dangers of Deepfakes, shown below.

Artificial Intelligence or AI

Artificial Intelligence has been broadly defined as the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

But arguably this level of intelligence is still the realm of ‘sci-fi’ and fiction. It is also known as Artificial General Intelligence (AGI): the intelligence of a machine that could successfully perform any intellectual task that a human being can. Reality’s version could be referred to as ‘weak AI’ or ‘narrow AI’: the use of software to study or accomplish specific problem-solving or reasoning tasks. Weak AI, in contrast to strong AI, does not attempt to perform the full range of human cognitive abilities.

GANS and Deepfakes

Deepfakes use a subset of machine learning, which is itself a subset of AI; it should be noted that all machine learning counts as AI, but not all AI counts as machine learning.

Deepfakes are created using machine learning, in particular GANs.

The basic components of every GAN are two neural networks – a generator that synthesises new samples from scratch, and a discriminator that takes samples from both the training data and the generator’s output and predicts if they are “real” or “fake”.

GANs

The generator input is a random vector (noise) and therefore its initial output is also noise. Over time, as it receives feedback from the discriminator, it learns to synthesise more “realistic” images. The discriminator also improves over time by comparing generated samples with real samples, making it harder for the generator to deceive it.
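To make that feedback loop concrete, here is a minimal GAN sketch in PyTorch. It learns to imitate a simple 1-D Gaussian rather than faces, and the architecture and hyperparameters are illustrative assumptions, but the generator/discriminator dynamic is the one described above.

```python
# Minimal GAN: a generator learns to mimic N(3.0, 0.5) by fooling a
# discriminator that is simultaneously learning to tell real from fake.
import torch
import torch.nn as nn

torch.manual_seed(0)
NOISE_DIM = 8

# Generator: random noise vector in, synthetic sample out.
G = nn.Sequential(nn.Linear(NOISE_DIM, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: sample in, probability of "real" out.
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0        # "training data": N(3.0, 0.5)
    fake = G(torch.randn(64, NOISE_DIM))         # generator's current attempt

    # Discriminator step: label real samples 1 and generated samples 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator call its fakes "real".
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

samples = G(torch.randn(1000, NOISE_DIM))
print(f"generated mean={samples.mean():.2f}, std={samples.std():.2f} (target 3.0, 0.5)")
```

Swapping the 1-D samples for image tensors and the two small networks for convolutional ones is, in essence, how face-generating GANs of the kind behind Deepfakes are built.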

ProGAN training progression (animated GIF).

What Now?

Although we don’t have synthetic beings able to walk, talk and mimic our actions, digital technology is getting close to creating digital versions of ourselves, mimicking our voices and even generating fairly accurate images of people who don’t exist.

https://thispersondoesnotexist.com/
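As a side note, collecting sample ‘fakes’ for a test like mine can be done in a few lines. This sketch assumes the site above still serves a freshly generated face on each request, which may change.

```python
# Hedged sketch: download a handful of GAN-generated faces to use as 'fakes',
# assuming thispersondoesnotexist.com returns a new JPEG per request.
import time
import requests

def download_fakes(count: int = 4, prefix: str = "fake") -> None:
    for i in range(count):
        resp = requests.get(
            "https://thispersondoesnotexist.com/",
            headers={"User-Agent": "deepfake-test-research"},
        )
        resp.raise_for_status()
        with open(f"{prefix}_{i}.jpg", "wb") as handle:
            handle.write(resp.content)
        time.sleep(1.0)  # be polite; the served image regenerates between requests

if __name__ == "__main__":
    download_fakes()
```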

Final Shift and Concept Ideation

After discussing the various strands of concepts and points of interest I had identified during this project and prior ones, I realised that whilst I had all these different ideas and concepts, I still had a very zoomed-out view and had yet to focus on a single idea despite the variety of areas and topics I had found.

However, while discussing my initial desire to design or create some kind of ‘sci-fi’ themed interface, we explored the option of focusing on a single technology within a sci-fi film. As in the prior project, we looked again at Blade Runner, which I had researched extensively, looking in particular at the Voight-Kampff machine.

The Voight-Kampff test was a test used as of 2019 by the LAPD’s Blade Runners to assist in the testing of an individual to see whether they were a replicant or not. It measured bodily functions such as respiration, heart rate, blushing and eye movement in response to emotionally provocative questions. It typically took twenty to thirty cross-referenced questions to distinguish a Nexus-6 replicant.

Syd Mead, the concept designer for the film, describes how he envisioned what the machine should look and feel like:

“The biggest challenge was the Voight-Kampff – Deckard brings it with him into Tyrell’s office and so-forth so it had to be a suitcase briefcase sized thing. and in my mind it had to be terrifying. and this machine was breathing because it would ‘inhale’ localized air between the interviewer and interviewee and process that and pick up acidic traces  and so-forth much as animals do because animals can smell if you’re afraid.”

This is the concept of using a machine to identify something that is a near-perfect “replica” of us: a machine that aids in identifying and distinguishing something for what it really is.

What would the ‘Voight-Kampff machine’ of today look or operate like?

If we explore the subject: a ‘replicant’, also known as an android, is a synthetic robotic being that appears to be an organic one.

Or, within the context of the film, they are defined as:

A replicant was a synthetic, bio-robotic being with para-physical capabilities, designed to resemble a living, organic being. It was a genetically engineered being “composed entirely of organic substance” and marketed for labor by the Tyrell Corporation and its successor, the Wallace Corporation.

Replicants are sometimes referred to as “skin-jobs”, as their likeness is often skin-deep. Replicants consider the term a slur.

 

Creating Options

I created a set of ‘Options’ to explore potential points of interest within my subject.

Option 1
Science Fiction envisions alternative or distant worlds; what would happen if we incorporated these technologies and interfaces into present-day, everyday life?

Links:

UNINVITED GUESTS – 2015

Option 2
Tech companies and products use blue to signal innovation and advancement in their designs. However, are they just using tropes to seem ‘leading edge’ when the reality is mundane?

‘A Mundane Science Future’

Links:
Taxodus – 2013
Microsoft Productivity of Future Vision
A Day Made of Glass

Option 3
Science fiction interfaces for film are designed to look flashy, mostly to provide quick, brief context for the narrative. Science fiction interfaces designed for UI within games require more consideration: the interface must stay usable while maintaining the look of sci-fi.

Links:

UI Design in NieR:Automata- 2017.08.25

Option 4
‘Future of the Past’: the shift in visions of past science fiction; re-imagining the imagined.
