Sponsored Content

How Intentional Innovation Can Build Trust in Tech

By Paula Goldman, Chief Ethical and Humane Use Officer at Salesforce

Last year, Americans’ trust in tech companies hit an all-time low. Just 54% trust tech companies to do the right thing, down nearly 20 points since 2019.

This comes as the digital economy hits an all-time high, driven by the pandemic and the changes it brought to how many people do business. Salesforce research shows that 57% of consumers today prefer interacting with businesses online to offline. In fact, nearly 70% of customers say they’ll continue to buy the majority of their goods online even after the pandemic.

In today’s economy, trust is more than a value: it’s a critical competitive advantage, one that can make or break everything from customer relationships to a business’s bottom line.

The Trust Imperative

Trust may feel hard to measure and even harder to deliver. But really, it’s just about giving people the same kind of experience you’d want: one where you feel like a valued stakeholder, not one data point in a billion.

A lot of people (myself included) are used to businesses knowing enough about us to personalize their services. But when you start getting offers on treatments for medical conditions you haven’t disclosed to the company in question, for example, a personalized experience can quickly become a problematic one.

Many companies and developers have historically relied on “dark patterns”: design choices that trick people into making (or not making) decisions that benefit the business but aren’t in the user’s best interest. Think of every time the “agree” button is bigger and brighter than the “do not consent” button in an app agreement. Every day, these defaults lead users to unknowingly agree to be targeted or give up their privacy, simply by clicking a highlighted button.

But what if technology defaulted to trust instead of tricks? What would it look like if technology were embedded with features that prioritized things like privacy, inclusion, and ethics?
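As a thought experiment (the model below is mine, not any particular product’s), “defaulting to trust” can be as simple as a consent object that starts undecided, makes declining exactly as easy as agreeing, and never counts silence as a yes:

```python
from enum import Enum

class Consent(Enum):
    UNDECIDED = "undecided"
    GRANTED = "granted"
    DECLINED = "declined"

class ConsentPrompt:
    """A consent prompt that defaults to trust: nothing is pre-agreed,
    and declining is exactly as easy as agreeing."""

    def __init__(self, purpose: str):
        self.purpose = purpose
        self.state = Consent.UNDECIDED  # intentional default: no consent assumed

    def agree(self) -> None:
        self.state = Consent.GRANTED

    def decline(self) -> None:
        self.state = Consent.DECLINED

    @property
    def may_track(self) -> bool:
        # Silence or a dismissed dialog never counts as agreement.
        return self.state is Consent.GRANTED

prompt = ConsentPrompt("personalized advertising")
print(prompt.may_track)  # False until the user explicitly agrees
```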

Intentional by Default

In my experience, trusted technology is built with intention. It starts by looking at the real-world impact of a product or service, and then crafting a strategy that makes it easy and automatic to get the results that customers expect and deserve. 

Today, it can often feel like the burden of responsibility is placed on consumers—having to pay extra attention or learn to recognize deceptive strategies in the products and platforms they use. But I believe there’s a more structural solution, one that begins and ends with the businesses that design these technologies.

One way companies can do this is through what we call Intentional Defaults. Most software products and services come with a default, out-of-the-box setting. We work with our designers, product managers, and engineers to ensure that Salesforce features are set to an ethical and inclusive option by default.

Consider, for example, whether address should be a mandatory field on a vaccine intake form, something we weighed when building Vaccine Cloud. We realized that asking for people’s addresses could exclude the unhoused from getting vaccinated, so by default we set address to optional. It’s these types of small changes that build trust and help people connect with companies in ways that feel positive, not pushy.
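As a loose illustration of the idea (the schema below is hypothetical, not actual Vaccine Cloud configuration), an intake form can be modeled so that every field is optional unless a team deliberately justifies making it mandatory:

```python
from dataclasses import dataclass

@dataclass
class FormField:
    """One intake-form field. 'required' defaults to False, so making a
    field mandatory is a deliberate, documented decision."""
    name: str
    required: bool = False       # intentional default: optional
    justification: str = ""      # why a required field is truly necessary

def missing_required(form: list, submission: dict) -> list:
    """Names of required fields the submission left blank; optional
    fields never block anyone from submitting."""
    return [f.name for f in form if f.required and not submission.get(f.name)]

# Hypothetical vaccine-intake form: name is needed to book the appointment,
# but address stays optional so unhoused patients aren't turned away.
intake_form = [
    FormField("full_name", required=True, justification="needed to schedule the dose"),
    FormField("address"),  # optional by default
]

print(missing_required(intake_form, {"full_name": "Ada Lovelace"}))  # -> []
```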

Intentional Innovation

Working toward more intentional innovation in the tech industry is one of my team’s main focus areas. We guide the responsible design, development, and use of our technologies, and we work to enable the thousands of companies who use our products to do so ethically.

So, what does it look like when real-world products use Intentional Defaults?

A recent example of this is Salesforce Genie, a new solution that manages real-time data across our Customer 360 platform. We knew Genie would be a powerful source of insight, but we also understood that its real power would come from customer relationships, not just aggregated data. My office has worked with the product team for over two years to help create guardrails and features that enable users to implement data ethics best practices, making it easy to do things like the following (see the sketch after this list):

  • Target customers based on interests, rather than demographics
  • Keep sensitive info separate from more general data
  • Give customers control over how their data is collected and used
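Here’s a minimal sketch of what one such guardrail could look like in practice. The record layout, consent flags, and field names are assumptions for illustration; they don’t reflect Genie’s actual implementation:

```python
class CustomerProfile:
    """Keeps sensitive attributes apart from general data and releases
    them only for purposes the customer has explicitly approved."""

    def __init__(self, general: dict, sensitive: dict, consented_uses: set):
        self._general = general              # e.g. declared interests
        self._sensitive = sensitive          # stored separately from general data
        self._consented_uses = consented_uses

    def data_for(self, purpose: str) -> dict:
        """Return only the fields this purpose is allowed to see."""
        visible = dict(self._general)
        if purpose in self._consented_uses:  # the customer controls each use
            visible.update(self._sensitive)
        return visible

profile = CustomerProfile(
    general={"interests": ["cycling", "cooking"]},  # target on interests...
    sensitive={"income_band": "redacted"},          # ...not demographics
    consented_uses={"order_support"},               # approved uses only
)

print(profile.data_for("marketing"))      # -> {'interests': ['cycling', 'cooking']}
print(profile.data_for("order_support"))  # includes sensitive data: consent was given
```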

Our team is also very focused on ethical AI and how Intentional Defaults can make these transformative technologies more inclusive. We’ve built sensitive field warnings into AI products like Einstein, cautioning customers against collecting information that could introduce bias and letting them know that fields like ZIP codes and home addresses can serve as proxies for race and income in the United States.
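The warning pattern itself is simple to sketch. The proxy list and message wording below are illustrative assumptions, not Einstein’s actual rule set:

```python
# Known proxy variables and why they're risky (illustrative mapping only).
PROXY_WARNINGS = {
    "zip_code": "ZIP codes can serve as a proxy for race and income in the US",
    "home_address": "home addresses can serve as a proxy for race and income in the US",
}

def sensitive_field_warnings(columns: list) -> list:
    """Return a warning for every training column that is a likely proxy."""
    return [
        f"Warning: '{col}' - {PROXY_WARNINGS[col]}"
        for col in columns
        if col in PROXY_WARNINGS
    ]

for warning in sensitive_field_warnings(["age", "zip_code", "purchase_total"]):
    print(warning)
# -> Warning: 'zip_code' - ZIP codes can serve as a proxy for race and income in the US
```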

From guidance to in-app guardrails to simply setting the defaults intentionally, we’re working to disrupt the processes that create dark patterns and replace them with ones that prioritize trust.

In Closing

The digital economy is here, and the race to win consumer trust is already running at full throttle.

The good news is our society has been here before. More than a decade ago, consumers were pushing the security industry to build products with safety and privacy in mind. And decades before that, consumers demanded more safety from the auto industry and seat belts became not only a new standard—but law.

Businesses have learned to build intentional, trusted products by working together with impacted communities, civil society and government. If we’re willing, I know the tech industry can do it again.