Category Archives: UX Strategy

Digital Transformation is levelling the playing field

It’s a buzzword that’s been around for some time now. Digital transformation is now considered a necessity and could well be the key to reshaping the way corporations work. Transformation is challenging, but it has to be tackled now to avoid being left behind. And like any challenge, it brings opportunity – a chance to reshuffle the deck and come out with a stronger hand for the future.

What are the three areas leaders need to consider to have a winning hand?

Any industry leaving analogue: “It’s not you, it’s me”

Let’s face it: technology is now pervasive in our daily lives. Even the most “offline” professionals can do better, faster and cheaper with the help of innovation, from the local pub learning it can attract more customers through social media to your doctor relying on 3D imagery to help diagnose your knee injury. It is hard to think of an industry that is entirely untouched by new advancements in technology.

Just think about all the technological devices you rely on to do your day-to-day work.


Your customers won’t wait for you to improve

Power is shifting to customers. They have more options, and there are little to no switching costs or penalties these days, so your customer is never really only yours. They also research products and look for digital social cues to establish emotional connections with a brand long before purchasing.

With so many channels (especially digital ones), a profound qualitative understanding of your customer base is no longer a nice-to-have but a necessity. At the touch of a finger, customers can compare your product to one from the start-up around the block or a multinational across the world.

Your customers are embracing a global playing field, so why not accompany them on the journey?  

Your employees are embracing new ways of working

For many companies, managing how technology influences employee productivity is the scariest challenge.

How can companies align internal operations to save costs and be efficient? And how can they do so in a landscape that is ever-shifting and evolving? Companies strive to incorporate frugal principles into their day-to-day operations.

If the aim is to improve the user experience and customer experience for your audience, then internal experiences must improve in step.

Guess what? There’s good news if you are ready to redesign your internal experience. The key is knowing that all change begins internally. If your team lacks the skills or momentum, external help from trusted experts can provide a flexible approach, so that you can begin a UX change programme when you need it most. Your team will learn by doing alongside experts, rather than being stuck in planning mode for the next decade.


Digital transformation is coming your way, and it is here to stay. Any transformation will have internal and external impacts. See it as a magnificent opportunity to refine or reshape your services and stay ahead of the curve.

Putting customers first is the right step forward and having flexible internal processes is key for such a future.

Think of your employee experience as the hand you’re dealt initially; you can change the cards and improve your hand by striving to deliver the best user experience possible.

Paul Sauvage

Paul manages client requirements and needs. He is an experienced senior project manager who has successfully led digital and technical transformation projects in areas such as user experience, user-centred design, robotic process automation (RPA) and general change management. A well-rounded professional, he supports programmes focused on operational change and strategic improvements, guiding organisations toward sustainability and efficiency.

Why technology puts human rights at risk

Birgit Schippers, Queen’s University Belfast

Movies such as 2001: A Space Odyssey, Blade Runner and Terminator brought rogue robots and computer systems to our cinema screens. But these days, such classic science fiction spectacles don’t seem so far removed from reality.

Increasingly, we live, work and play with computational technologies that are autonomous and intelligent. These systems include software and hardware with the capacity for independent reasoning and decision making. They work for us on the factory floor; they decide whether we can get a mortgage; they track and measure our activity and fitness levels; they clean our living room floors and cut our lawns.

Autonomous and intelligent systems have the potential to affect almost every aspect of our social, economic, political and private lives, including mundane everyday aspects. Much of this seems innocent, but there is reason for concern. Computational technologies impact on every human right, from the right to life to the right to privacy, freedom of expression to social and economic rights. So how can we defend human rights in a technological landscape increasingly shaped by robotics and artificial intelligence (AI)?

AI and human rights

First, there is a real fear that increased machine autonomy will undermine the status of humans. This fear is compounded by a lack of clarity over who will be held to account, whether in a legal or a moral sense, when intelligent machines do harm. But I’m not sure that the focus of our concern for human rights should really lie with rogue robots, as it seems to at present. Rather, we should worry about the human use of robots and artificial intelligence and their deployment in unjust and unequal political, military, economic and social contexts.

This worry is particularly pertinent with respect to lethal autonomous weapons systems (LAWS), often described as killer robots. As we move towards an AI arms race, human rights scholars and campaigners such as Christof Heyns, the former UN special rapporteur on extrajudicial, summary or arbitrary executions, fear that the use of LAWS will put autonomous robotic systems in charge of life and death decisions, with limited or no human control.

AI also revolutionises the link between warfare and surveillance practices. Groups such as the International Committee for Robot Arms Control (ICRAC) recently expressed their opposition to Google’s participation in Project Maven, a military program that uses machine learning to analyse drone surveillance footage, which can be used for extrajudicial killings. ICRAC appealed to Google to ensure that the data it collects on its users is never used for military purposes, joining protests by Google employees over the company’s involvement in the project. Google recently announced that it will not be renewing its contract.

In 2013, the extent of surveillance practices was highlighted by the Edward Snowden revelations. These taught us much about the threat to the right to privacy and the sharing of data between intelligence services, government agencies and private corporations. The recent controversy surrounding Cambridge Analytica’s harvesting of personal data via the use of social media platforms such as Facebook continues to cause serious apprehension, this time over manipulation and interference into democratic elections that damage the right to freedom of expression.

Meanwhile, critical data analysts challenge discriminatory practices associated with what they call AI’s “white guy problem”. This is the concern that AI systems trained on existing data replicate existing racial and gender stereotypes that perpetuate discriminatory practices in areas such as policing, judicial decisions or employment.

AI can replicate and entrench stereotypes.
Ollyy/Shutterstock.com

Ambiguous bots

The potential threat of computational technologies to human rights and to physical, political and digital security was highlighted in a recently published study on The Malicious Use of Artificial Intelligence. The concerns expressed in this University of Cambridge report must be taken seriously. But how should we deal with these threats? Are human rights ready for the era of robotics and AI?

There are ongoing efforts to update existing human rights principles for this era. These include the UN Framing and Guiding Principles on Business and Human Rights, attempts to write a Magna Carta for the digital age and the Future of Life Institute’s Asilomar AI Principles, which identify guidelines for ethical research, adherence to values and a commitment to the longer-term beneficent development of AI.

These efforts are commendable but not sufficient. Governments and government agencies, political parties and private corporations, especially the leading tech companies, must commit to the ethical uses of AI. We also need effective and enforceable legislative control.

Whatever new measures we introduce, it is important to acknowledge that our lives are increasingly entangled with autonomous machines and intelligent systems. This entanglement enhances human well-being in areas such as medical research and treatment, in our transport system, in social care settings and in efforts to protect the environment.

But in other areas this entanglement throws up worrying prospects. Computational technologies are used to watch and track our actions and behaviours, trace our steps, our location, our health, our tastes and our friendships. These systems shape human behaviour and nudge us towards practices of self-surveillance that curtail our freedom and undermine the ideas and ideals of human rights.

And herein lies the crux: the capacity for dual use of computational technologies blurs the line between beneficent and malicious practices. What’s more, computational technologies are deeply implicated in the unequal power relationships between individual citizens, the state and its agencies, and private corporations. If unhinged from effective national and international systems of checks and balances, they pose a real and worrying threat to our human rights.

Birgit Schippers, Visiting Research Fellow, Senator George J Mitchell Institute for Global Peace, Security and Justice, Queen’s University Belfast

This article was originally published on The Conversation. Read the original article.

Ad Hoc London Team

Ad Hoc London explores audience needs in the UK. We routinely conduct UX and usability research in London, Southampton, Manchester, and Glasgow. We optimise information for laptops, tablets and smartphones so customers have the best possible user experience. We help clients benefit from understanding their audiences’ varying needs.

My trial and error experience of UX

UX is often dismissed as a buzzword. In a world where digital strategy is on every lip, where does UX fit? Is it the ultimate solution for IT departments? Can it make our products better, faster, stronger without being harder?

I came across UX about three years ago when I started working for Ad Hoc Global. Because of my dyslexia, I continually said “User Experiment” rather than “User Experience” for UX (it would agitate my managing director). However, the more knowledge of UX I acquired, the easier it became for me to justify it.

“We have to improve our customer experiences.” How many times have you heard this during a pitch?

Throughout life, a human being has both good and bad experiences. One thing that triggers these experiences is experiments: risky actions that move an individual from the comfort zone into the unknown. Once you get there, the unexplored land becomes your experience – a unique selling proposition for most companies. Iterations through carefully designed experiments give fine-tuned insights into creating experiences, whether it is browsing your latest application or reading signs while driving. “What if I experiment with following a sat nav rather than planning my trip ahead? Will my experience become more positive?” UX will make your experiments a success and your experience powerful.

Renowned psychologist Daniel Kahneman writes that part of our brain makes quick decisions without intense reflective effort. Based on this, I see UX as a way to better utilise this part of the brain. To quote another successful writer, and another Daniel, D.H. Pink: we are in a situation of information parity, where the user and the product have the same information at a precise instant. We perform actions knowing what to expect; we are no longer lost with a product, and we are prepared to take the next step into the unknown. Hence, users become the centre of discussions. The focus shifts from what the technology allows us to do to what we want to do in a particular situation. Features are optimised and, seen through end users’ eyes, paths toward final goals are defined. The world becomes a two-way communication system with inputs from both sides.

In the end, heuristic reviews are performed, usability improves, architectures become more intuitive, return on investment is maximised, strategies align with their audiences and risks are managed. Your experiment is an achievement, and the experiences it creates become memorable.
This is the power of UX.  

Paul Sauvage

Paul manages client requirements and needs. He is an experienced senior project manager who has successfully led digital and technical transformation projects in areas such as user experience, user-centred design, robotic process automation (RPA) and general change management. A well-rounded professional, he supports programmes focused on operational change and strategic improvements, guiding organisations toward sustainability and efficiency.

My Experience with Sitemaps

I was eager to explore building sitemaps after being introduced to their benefits while learning about the web design process. However, I had an unconventional first experience: I was asked to create a sitemap after the content and navigation had been decided. While after-the-fact planning is bad practice (particularly for a UX consultancy website), it was a learning experience in my development as a Junior UX Researcher. The most challenging part was structuring each page within the sitemap in a way the user would understand. Because the sitemap was based on a live website, I had to do some UX reverse engineering.

Figure 1: My First Attempt

On my first attempt, I made the mistake of including every heading and subsection of the website’s pages. In retrospect, it’s smarter to include only the headings the user interacts with. My approach resulted in an unclear, cluttered sitemap that did not faithfully represent the navigational complexity of the website.

It was after this initial setback that I made a mental breakthrough. In order for the sitemap to accurately portray the website, each page link needed to be displayed in a way that was understandable and intuitive for the user.

At first, this task seemed simple as the website did not have a lot of content. Nevertheless, creating an accurate pattern for the sitemap that precisely mapped each website page proved an obstacle. After some experimentation, I discovered that the website pages were all linked in a way that formed a circular navigation pattern.

Figure 2: Final Sitemap

After this discovery, I displayed the pattern in an easy-to-use map. As seen in Figure 2, all of the pages in the navigation bar are linked both to the ‘Homepage’ and to the international team pages. (The team pages are listed in a drop-down menu in the navigation bar.) The ‘Teams’ page links to ‘Services’, which links back to ‘Contact Us’. Finally, the ‘Insights’ page stands on its own, with links to the various social media accounts below.
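The circular navigation pattern described above can also be expressed as a simple adjacency map. The sketch below is only an illustrative reconstruction: the page names follow the sitemap in Figure 2, but the exact link set is an assumption, and the drop-down team pages and external social links are omitted for brevity.

```python
# A minimal sketch of the sitemap as an adjacency map.
# Page names follow the article; the precise link set is assumed.
sitemap = {
    "Homepage":   ["Teams", "Services", "Insights", "Contact Us"],
    "Teams":      ["Homepage", "Services"],
    "Services":   ["Homepage", "Contact Us"],
    "Contact Us": ["Homepage"],
    "Insights":   ["Homepage"],  # external social links omitted
}

def reachable(start, graph):
    """Return the set of pages reachable from `start` via internal links."""
    seen, stack = set(), [start]
    while stack:
        page = stack.pop()
        if page not in seen:
            seen.add(page)
            stack.extend(graph.get(page, []))
    return seen

# The "circular" pattern: every page can reach every other page,
# because each page links back to the Homepage hub.
assert all(reachable(page, sitemap) == set(sitemap) for page in sitemap)
```

Modelling the map this way makes the circularity easy to check: since every page links back to ‘Homepage’ and ‘Homepage’ links out to everything, the reachable set from any starting page is the whole site.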

This task has shown me the utility of sitemaps for both designers and users. Sitemaps can help a user navigate, or allow a designer to structure a meaningful navigation process. In essence, sitemaps are hierarchical models that break content down into specific areas and show the relationships between internal and external pages. I now understand why building a sitemap is a fundamental skill for UX practitioners.

Gary Maccabe

Junior UX Researcher with a background in psychology and social media management. My interest in UX design stems from studying Cyberpsychology and human cognition as part of my BSc (Hons) Psychology degree. Although it is still early in my career, I have provided many large organisations with UX services and solutions. This experience has given me a solid understanding of what good UX design is and how to deliver it.