Tonic Digital

Category: Data

Value Creation via NFTs

Monday, 13 December 2021 by Tonic Digital

In April 2021, we wrote about the growing awareness of NFT technology after Mike Winkelmann sold an NFT artwork for an eye-watering $69 million. Since then, the use of NFTs has been expanding from the art world into areas unforeseen seven months ago.

  • Published in Data, ethics, NFT

The Creeping of AI

Wednesday, 08 December 2021 by Tonic Digital

AI is here, shaping our lives and influencing decisions in ways we rarely consider. We tend to notice AI only when an exciting breakthrough makes the news or when another TV series imagines an AI apocalypse, yet it affects us every day: what we see on social media, which ads pop up as we scroll through Instagram, even the speeding fine we receive. All of it is determined by AI.

What is AI?

AI is a computer program that uses complex code and processing power to sift through massive amounts of data in order to make decisions and take actions. It is more accurate to speak of artificial intelligences (AIs), because countless different programs fall under the AI umbrella.

From online dating and shopping to dealing with government departments, AI is becoming increasingly common. In marketing, AI is used to target advertising to people based on their past and present shopping and browsing preferences. From this data, predictive algorithms make inferences about the choices we are likely to make in the future. It is estimated that these algorithms drive 35% of what people buy on Amazon and 75% of what they watch on Netflix.
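
As a rough illustration of the principle (and not how Amazon or Netflix actually build their systems), the short sketch below scores products by how often they are bought together and uses that to suggest likely future purchases; the baskets and product names are invented for the example.

```python
# Illustrative only: a toy "customers who bought X also bought Y" recommender.
# The purchase histories and product names are invented for this example.
from collections import Counter
from itertools import combinations

purchase_histories = [
    {"running shoes", "water bottle", "socks"},
    {"running shoes", "socks", "fitness tracker"},
    {"water bottle", "yoga mat"},
    {"running shoes", "water bottle", "fitness tracker"},
]

# Count how often each pair of products appears in the same basket.
co_purchases = Counter()
for basket in purchase_histories:
    for pair in combinations(sorted(basket), 2):
        co_purchases[pair] += 1

def recommend(product, top_n=3):
    """Rank other products by how often they were bought alongside `product`."""
    scores = Counter()
    for (a, b), count in co_purchases.items():
        if a == product:
            scores[b] += count
        elif b == product:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("running shoes"))  # e.g. ['socks', 'water bottle', 'fitness tracker']
```

Real recommendation engines combine far richer signals and models, but the underlying idea is the same: past behaviour is turned into a score for the choices we are most likely to make next.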

Is AI as neutral and objective as we think?

The code that sifts through large amounts of data is built on algorithms, the sets of rules a computer follows to process the data and arrive at conclusions. An algorithm pursues the mathematical objective set by its designer. AI decisions are often viewed as the result of neutral, objective technology and therefore as superior to decisions made by people.

The reality is that AI is, from start to finish, the product of human choices and decisions that are influenced by human biases, shaped by human values, and prone to human error. This means AI is only as good as the data used to train it. If the data is incorrect, biased or of poor quality, the decisions and actions taken by the AI will also be inaccurate, biased and of poor quality.
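
A deliberately simplified, hypothetical sketch makes the point: a "model" that merely learns from historically biased decisions will reproduce that bias, however tidy the maths. Every record and group label below is invented.

```python
# Illustrative only: a toy model that learns the historical approval rate per
# group. If the historical decisions were biased, the model's decisions are too.
# All records and group labels are invented for this example.
historical_decisions = [
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": True},
    {"group": "A", "approved": False},
    {"group": "B", "approved": False},
    {"group": "B", "approved": False},
    {"group": "B", "approved": True},
    {"group": "B", "approved": False},
]

def train(records):
    """Learn each group's historical approval rate."""
    rates = {}
    for group in {r["group"] for r in records}:
        group_records = [r for r in records if r["group"] == group]
        rates[group] = sum(r["approved"] for r in group_records) / len(group_records)
    return rates

def predict(rates, group, threshold=0.5):
    """Approve whenever the learned historical rate clears the threshold."""
    return rates[group] >= threshold

model = train(historical_decisions)
print(model)                # e.g. {'A': 0.75, 'B': 0.25}
print(predict(model, "A"))  # True: group A keeps being approved
print(predict(model, "B"))  # False: group B keeps being rejected
```

The arithmetic is flawless, yet the outcome simply echoes the bias baked into the training data.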

Centrelink’s Robo-debt system is a clear example of decisions made by an automated system being assumed to be correct despite evidence of the real harm it was causing to people.

The recent investigation by US lawmakers into Facebook’s and other platforms’ use of algorithms to push emotive and toxic content that amplifies depression, anger, hate and anxiety is further evidence that algorithms are not the neutral, objective technology we may have assumed.

Every automated action on the internet, from ranking content and displaying search results to offering recommendations, is controlled by computer code written by engineers who are often white, well-educated and affluent [12].

One example is employment, where AI technology is used in hiring decisions. Men are often chosen over better-qualified women because of gender bias embedded in the program, which the AI then reinforces [13].

Ethics and AI

AI will continue to shape our lives and the decisions we make. The question, then, is how AI can be used ethically and in ways that add value, rather than destructively as in Robo-debt.

  • Structural changes

Organisations developing and implementing AI need to ensure that the teams building the program represent the wider community. This means involving women, people with disabilities, and people from diverse cultural and economic backgrounds equally in program design, to counter the unconscious biases that arise when algorithms are set predominantly by white, middle-class, educated males.

  • Accountability

Who is accountable for automated decisions, and to whom are they responsible [14]? It is easy for the algorithm makers, or the decision-makers who chose to implement an AI program, to blame “the system”. However, as outlined in this article, the “system” is generated by people who have their own biases, assumptions and beliefs.

Companies like Google have established principles for “responsible AI” to guide their practices, including consideration of issues like fairness, privacy and security [15]. AI systems also need to be accountable to the end user, that is, the people affected by the decisions the AI program takes. These decisions need to be easily explainable to the end user; otherwise, the result will be growing distrust and a sense of alienation.
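
As a hypothetical sketch of what “explainable to the end user” could look like in practice, an automated decision can be returned together with the plain-language reasons that produced it; the rules, thresholds and field names here are invented for illustration.

```python
# Illustrative only: return an automated decision alongside the plain-language
# reasons that produced it. The rules and thresholds are invented examples.
from dataclasses import dataclass

@dataclass
class Decision:
    approved: bool
    reasons: list

def assess_application(income: float, existing_debt: float) -> Decision:
    """Apply simple, documented rules and record every rule that fails."""
    reasons = []
    if income < 30_000:
        reasons.append(f"Declared income (${income:,.0f}) is below the $30,000 minimum.")
    if existing_debt > income * 0.5:
        reasons.append("Existing debt exceeds half of declared income.")
    approved = not reasons
    if approved:
        reasons.append("All documented criteria were met.")
    return Decision(approved=approved, reasons=reasons)

decision = assess_application(income=28_000, existing_debt=20_000)
print("Approved:", decision.approved)
for reason in decision.reasons:
    print("-", reason)
```

Whatever the underlying model, the person affected sees both the decision and the specific reasons for it, which is also what makes the remedy process discussed below workable.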

  • Remedy

When AI makes the wrong decision, there should be a transparent remedy process for affected people. As demonstrated with Robo-debt, the impact on people’s mental health when they feel they are battling an impersonal algorithm can be immense.

Technology is never neutral. While AI can make our lives easier and streamline our choices, we need to ensure that the way AI is used is inclusive and builds communities, rather than alienating people because of unconscious biases or assumptions in how programs are developed.

[1] https://www.abc.net.au/news
[2] ibid.
[3] https://theconversation.com/ethics-by-numbers
[4] https://theconversation.com/how-marketers-use
[5] ibid.
[6] https://theconversation.com/ethics-by-numbers
[7] https://www.abc.net.au/news
[8] ibid.
[9] ibid.
[10] ibid.
[11] https://www.bloomberg.com/news/articles
[12] https://www.abc.net.au/news/
[13] ibid.
[14] ibid.
[15] ibid.

  • Published in Data, ethics

Facebook’s Corporate Citizenship

Tuesday, 09 November 2021 by Tonic Digital

The business pages of The West Australian on Wednesday, 27 October, reported that Zuckerberg was riled by the bad press Facebook had received over the documents Frances Haugen provided to the US Congress and the Securities and Exchange Commission.

  • Published in Data, ethics, marketing, Wellbeing

Understanding your data

Thursday, 26 August 2021 by Tonic Digital

Do you know what your organisation’s digital business strategy is? Are you clear about the value of a digital business strategy for your organisation?

  • Published in Data, marketing

Nudge Theory

Monday, 02 August 2021 by Tonic Digital

Choosing an apple or a banana when paying for petrol because the fruit sits where we pay, or finding vending-machine snacks replaced by healthy options, are examples of customers being nudged in their decision-making.

  • Published in Data, marketing, Psychology

Chess vs Checkers

Tuesday, 27 July 2021 by Tonic Digital

Are you more of a chess or checkers (draughts) player when it comes to planning and implementing strategy in your organisation? These board games, generally played among friends unless you play chess competitively, can help us reflect on our style of leadership and how we implement strategy.

  • Published in Data, marketing

Customer Life-Time Value

Thursday, 22 July 2021 by Tonic Digital

We pay lip service to the importance of customers and clients. For businesses negatively impacted by COVID, whether B2B or B2C, maintaining, and indeed growing, a solid base of loyal clients will be the difference between success and failure. It is therefore crucial to know the value of our most important asset.

  • Published in Data, marketing

Where to hang your NFT?

Wednesday, 14 April 2021 by Tonic Digital

Finding hanging space for your recently purchased NFT artwork is not something you need to worry about, for your artwork will in all likelihood never need to be hung.

  • Published in Data, News, Workplace

5 stages of data analytics

Friday, 13 March 2020 by Tonic Digital

Data analytics is the process of examining data sets in order to analyse and draw conclusions from historical outcomes; increasingly, this is done with the aid of software such as dashboards.

There are five stages of data analytics which we will explore in this article.

  • Published in Data

Decision-making traps using data

Friday, 22 November 2019 by Tonic Digital

Key takeouts

The nature of descriptive analytics exposes an array of decision-making pitfalls for decision-makers. By developing an awareness around common cognitive traps, we can use our understanding of this process to make more balanced and meaningful decisions.

In this article, we reflect on the individual traps in the data decision-making process, learn how to frame a problem so that we can use insights to make better decisions, and understand these golden rules around decision-making pitfalls:

  • The way we make decisions is far from the rational model
  • Decision biases affect every step of the decision-making process
  • Analytics can offer multiple tools to overcome biases and decision traps

Decision trap 1: Availability bias

Availability bias is a heuristic whereby people judge the likelihood of an event by how easily an example, instance or case comes to mind. We all experience availability bias in everyday dialogue, but especially in marketing when we hear, “but, you know, it’s much harder to get data about that”, and this is often true. The point is to acknowledge that the data being used may not necessarily answer your question.

It depends on how important the question that you’re trying to answer is; if it’s a high-stakes decision, you should be creative about collecting various sources of data to avoid this trap.

Decision trap 2: Short-term Emotions

Every decision we make is influenced by our emotions, and in high-pressure situations, especially in group environments, it’s very easy for our judgment to be clouded by them.

This obvious trap simply emphasises the importance of taking a moment to assess the external view. By allowing some distance from short-term pressures and emotions, we can assess the data without our judgement being clouded or distracted. These windows create an opportunity for emotional intelligence (EQ) to flourish, with awareness and mindfulness acting as a key deterrent to this trap.

There is room for both emotions and data, particularly in creative environments. However, it’s often the case that the former can outweigh the latter when the pressure is very high, and that’s precisely what we need to avoid.

Decision trap 3: Confirmation Bias

Confirmation bias is the tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs. This biased approach to decision-making is largely unintentional and often results in ignoring inconsistent information.

We’re all very likely to fall into confirmation bias for the simple reason that it makes us feel intelligent. When we pursue a risky avenue, we seek out people who will give us positive feedback and encouragement, which is precisely what we want for building our confidence. But that feedback should not be interpreted as objective evidence, and it’s not necessarily what we need to make a good decision.

What we actually need is disconfirming evidence: evidence that refutes an opinion or forecast. Interestingly, disconfirming evidence is regularly overlooked in data analysis and is widely considered one of its most underused tools.

Decision trap 4: Overconfidence

Overconfidence is pervasive in communications, and is arguably a requirement in the field to validate pitches and concepts. In the data science world, however, it’s a pitfall, even for people whose job is to look at data objectively. We don’t want people who aren’t hopeful, as we would never have those breakthrough findings, but at the same time we need to stay objective and recognise when something works and when it doesn’t. This is precisely why being able to use data in the right way is so critical.

Decision trap 5: Narrow Framing

Psychologically, the framing effect is a cognitive bias where people decide between options based on whether those options are presented with positive or negative connotations. When we use analytics, we can clearly define the objectives and the boundaries of a project from the start. This is not just important for our own understanding; it’s also critical for our peers, so that everybody can build consensus about what we’re trying to achieve, how we measure success and what the limitations of our analysis might be.

When we suffer from a narrow framing effect, big data and advanced analytics can help us widen our frame, because we can explore directions or hypotheses we normally couldn’t, simply because we wouldn’t otherwise have the opportunity to collect the data and infer patterns from it.

Summary

The general consensus is that the main obstacle to good decision-making is being unaware of our own biases. If we think about common workplace behaviour, such as seeking positive feedback and recalling only that, we can see how susceptible our thinking is to all of these traps. Yet, as we know, avoiding negative feedback means missing important insight for continued learning and development, a point that applies equally to data analysis.

  • Published in Data, Psychology