
Security in the Sharing Economy

The collaborative, gig, peer-to-peer or sharing economy: whatever your preferred term, these platforms have taken off around the world. The European Commission estimated 2015 gross revenue in the EU from platforms and providers at €28 billion[1], Pew Research reported in 2016 that 72% of American adults have used a sharing economy or on-demand service, and Professor Arun Sundararajan stresses the immense untapped market potential in India and China. But with such success comes inevitable friction, namely concerns about government regulation, precarious employment, racial discrimination, and whether the “sharing economy” really lives up to its name in terms of profit distribution. Delivering value as a sharing economy platform in 2017 means navigating these issues successfully – by no means an easy task.

The majority of Americans have participated in the sharing or on-demand economy. Pew Research Center, Shared, Collaborative and On Demand: The New Digital Economy, May 19, 2016

As a Communications Platform as a Service (CPaaS) provider, our role in helping these platforms grow falls mainly into two areas – facilitating matchmaking between buyers and sellers and, crucially, facilitating security. Having watched the market evolve, we’re eager to weigh in on how sharing economy platforms can add value to users, especially with the help of SMS and voice.

What does it take to build a successful sharing economy platform?

A 2016 U.S. Federal Trade Commission report outlines the widely accepted key elements that define successful sharing economy platforms:

  1. they are “successful if they are liquid”;
  2. they enable matchmaking between buyers and sellers in real time; and
  3. “the transactions in them are safe.”[2]

Point one means that the platform needs to offer a “thick” marketplace environment – enough buyers and sellers to warrant using the platform. Providers can use various methods to build up both sides, like temporarily adjusting prices to incentivize buyers and/or sellers and then readjusting once enough users are attracted.[3]

Points 2 and 3 are worth exploring in more detail.

Why SMS can be a good choice for real-time matchmaking

Sharing economy platforms need to match buyers and sellers in real time to make themselves more attractive than hunting for a traditional retailer or service on Google.  In other words, they have to be convenient. Timely notifications are crucial – just try and think of using Uber, TaskRabbit or Airbnb without them.  What channel(s) you choose will depend on your offering, how you’ve built your platform and your stage of development, but we’re generally proponents of using SMS in a SaaS context. The same reasoning applies for sharing economy platforms:

SMS are immediate

It’s a statistic you’ve probably heard before – SMS have an average open rate of 98%. Though email has its own strengths, messages often stagnate unopened in inboxes, making them less than ideal for time-sensitive notifications: HubSpot reported in 2017 that the average open rate for sales emails across all industries is 32%, and IBM Marketing Cloud’s 2016 Email Marketing Metrics Benchmark Study reports similar findings, with an overall mean open rate of 21.8%.

At the least, SMS can be a good complement to email – one of our clients, who operates an online platform for buying and selling used clothing, sends an SMS when an offer is made to sellers who aren’t connected to the platform or who haven’t read the email notification. This way, they can be sure the offer is seen.
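To make that email-first, SMS-fallback flow concrete, here is a minimal sketch in Python. It assumes a simple in-memory list of pending notifications, and the helper functions (send_email, send_sms, email_was_opened) are stand-ins for whatever email and messaging provider your platform uses – they are illustrative assumptions, not a specific API.

```python
# Hedged sketch of an email-first, SMS-fallback notification flow.
# send_email, send_sms and email_was_opened are illustrative stubs,
# not any particular provider's API.
from datetime import datetime, timedelta

FALLBACK_DELAY = timedelta(minutes=30)  # how long to wait before falling back to SMS

def send_email(address, subject, body):
    print(f"[email] to={address} subject={subject}")
    return "msg-001"  # message id returned by the email provider

def email_was_opened(message_id):
    return False  # in practice, query your email provider's open tracking

def send_sms(number, text):
    print(f"[sms] to={number} text={text}")

def notify_seller_of_offer(seller, offer, pending):
    """Email the offer, or text the seller straight away if they aren't connected."""
    if not seller["is_online"]:
        send_sms(seller["phone"], f"New offer on your listing: {offer}")
        return
    message_id = send_email(seller["email"], "You have a new offer", offer)
    pending.append((message_id, seller, offer, datetime.utcnow()))

def escalate_unread_offers(pending):
    """Run periodically: text sellers whose offer email is still unread."""
    for message_id, seller, offer, sent_at in list(pending):
        if datetime.utcnow() - sent_at < FALLBACK_DELAY:
            continue
        if not email_was_opened(message_id):
            send_sms(seller["phone"], f"New offer on your listing: {offer}")
        pending.remove((message_id, seller, offer, sent_at))
```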

Finally, SMS also have encouraging response rates: the average response time to a text is 90 seconds, compared to 90 minutes for an email. For a platform that needs to deliver real-time matchmaking for buyers and sellers, this immediacy can be attractive.

They’re ubiquitous

Though smartphone ownership is increasing worldwide – and with it, easy access to emails and apps – it is by no means universal, especially outside of the U.S. and Europe. SMS have the advantage of being deliverable to a wider range of mobile phones. 

Though smartphone ownership throughout the world is increasing, it is by no means universal. Pew Research Center, Smartphone Ownership and Internet Usage Continues to Climb in Emerging Economies, February 22, 2016

They (can be) safe

CPaaS providers can assign temporary or anonymous numbers to platform users for SMS or voice interactions. This can help providers ensure that all communication stays on the platform, which is key to many offerings for insurance and safety reasons, and to avoid “platform leakage.”

This brings us to our last point…

Building safety into sharing economy platforms 

“But the real magic and the secret sauce behind collaborative consumption marketplaces like Airbnb isn’t the inventory or the money. It’s using the power of technology to build trust between strangers.” – Rachel Botsman (3:06)

Success point 3: a sharing economy platform works “if the transactions in them are safe.”

Safety is arguably the crux of the debate surrounding the sharing economy, and one of the most difficult issues to resolve. Opinions differ as to exactly how to ensure the safety of buyers as well as sellers, or in fact who’s responsible for it, as the Pew Research survey below indicates. Governments tend to take a stronger stance, with a June 2016 European Commission agenda stating that, “…the Commission does call on the [tourism] platforms to take voluntary action to fight illegal content online and to increase trust.”[4] It’s certainly in the interest of platform providers to decrease the real (and perceived) risk involved in using their service, and there are several ways this can be done.

Though safety is a key concern in the sharing economy, the public isn’t unanimous on who’s responsible for what. Pew Research Center, Shared, Collaborative and On Demand: The New Digital Economy, May 19, 2016

Rating systems – good but not perfect

If your providers are individuals, the best way to provide value is to focus on trust and safety. People are hesitant to trust strangers. If I am renting my car or power drill to a stranger, how can I be sure they will not steal it, trash it or break it? This distrust creates lots of friction in C2C marketplaces. The marketplace itself can reduce friction by acting as a trusted middleman.

– Juho Makkonen, January 5, 2016

One critical mechanism that has come to be a cornerstone of the sharing economy is reputation rating systems.  Although they do decrease “information asymmetry,” as the FTC report calls it, there are also certain limitations:

Reputation rating system limitations:

  1. Ratings Biased Upward and Toward Extreme Experiences: Users are more prone to leave a review if their experience was particularly good or poor, and many skip reviewing altogether if it was negative (to avoid potential confrontation). This can skew results.
  2. Ratings Can Be Manipulated for Strategic Purposes: Fake reviews, whether positive or negative, abound, and some people shy away from posting negative feedback for fear of repercussions or of violating social etiquette.
  3. Impact of Experience: Newer users who aren’t used to how reviews tend to be written aren’t as adept at interpreting them as more experienced users.
  4. Cold Starts as a Problem for New Entrants: Users who are completely new to the platform necessarily have no positive (or negative) reviews, making it difficult for them to build a reputation and get the ball rolling.
  5. Reputation Milking and the Final Period Problem: People who know they will be leaving the platform or who want to play off their good reputation score may feel they can benefit from these two situations by doing what they like without reprisal. [5]

All in all, reputation systems certainly help build safer sharing economy platforms, but they shouldn’t be used as the sole mechanism for trust building. As the FTC report states, “Although panelists generally agreed that reputation ratings systems are working well in the sharing economy, many expressed the view that these systems do not function perfectly.”[6]

Platform intervention – also good, but also not perfect

If you have a rental marketplace, you can offer insurance. If an item is stolen or broken, you cover it—but only if the payment was conducted through your marketplace’s payment system. Peer-to-peer carsharing marketplace RelayRides prides itself on offering insurance of up to $1 million, and it covers not only the damage to the car, but also potential claims from third parties for damage or injuries. KitSplit focuses on low-value items (cameras and other creative equipment), so in their case lower coverage of up to $10,000 is enough. Unlike most of its competitors, BlaBlaCar charges a commission, and justifies it with ridesharing insurance.

– Juho Makkonen, January 5, 2016

Many platforms also recognize the need to intervene directly in a variety of ways to ensure safety and build trust in the community. This can be called “platform intervention.”

Platform intervention:

  1. Curate entry into platform: providers can select who can begin using the platform based on certain criteria – for a ride-hailing app, this might mean those who have valid driver’s licences.
  2. Reimbursement options: useful for service providers to reassure unhappy customers.
  3. Insurance through platform: can help home sharing platforms be perceived as more secure, for example. [7]

However, platform curation can never filter out all bad apples, and reimbursement and insurance options won’t prevent problems, just help after the fact. Reputation rating and platform intervention systems cannot provide totally secure sharing economy experiences.

Building the right mix of transparency and privacy with voice and text messaging

Now, with all of my optimism, and I am an optimist, comes a healthy dose of caution, or rather, an urgent need to address some pressing, complex questions. How to ensure our digital identities reflect our real world identities? Do we want them to be the same? How do we mimic the way trust is built face-to-face online? How do we stop people who’ve behaved badly in one community doing so under a different guise?

– Rachel Botsman, TED talk, September 2012

A crucial decision when building these systems is to what degree users’ “real” contact information – telephone number, email address, social network profile or even full name – should be made public. Some advocate for “total transparency,” arguing that this acts as a self-regulating mechanism. The reasoning goes that if everyone’s equally exposed, we’re all likely to act reasonably, and we’re all equally vulnerable if we don’t. Indeed, in some instances, it’s preferable to know users’ real identities; integration with social networks like Facebook, or requiring a university email address, can reassure users about who they’re interacting with, or keep platforms within a closed community.

Airbnb allows users to sign in via Facebook, linking users’ “real” identities to their Airbnb profiles. This can be reassuring for users, especially if they see they have contacts in common.
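For the closed-community approach mentioned above, a sketch of what a sign-up gate on a university email domain could look like is below. The domain list and function name are assumptions for illustration; in practice you would still email a confirmation link to prove the user actually owns the address.

```python
# Hypothetical sketch: accept sign-ups only from allowed university email domains.
import re

ALLOWED_DOMAINS = {"example.edu", "student.example.edu"}  # your community's domains
EMAIL_RE = re.compile(r"^[^@\s]+@([^@\s]+)$")

def is_allowed_signup(email: str) -> bool:
    """Return True only if the email address belongs to an allowed domain."""
    match = EMAIL_RE.match(email.strip().lower())
    return bool(match) and match.group(1) in ALLOWED_DOMAINS

print(is_allowed_signup("jane@example.edu"))  # True
print(is_allowed_signup("jane@gmail.com"))    # False
```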

However, there are also many opportunities for people to take advantage of unprotected contact information:  Scammers and spammers look for phone numbers readily available on websites for phishing campaigns, and research has well documented the tendency to act in disinhibited ways online,[8] which can lead to cases of harassment or other unsavoury episodes.  The issue is more one of balance. As a 2015 Future of Privacy Forum report explains,

“…[sharing economy] platforms must understand the relationship among identity, anonymity and obscurity, and reputation that will facilitate user trust… some of the steps needed for users to build and maintain their reputation on a sharing economy platform can create privacy challenges. Platforms need to offer a degree of transparency in how users can access their information. They also can offer users obscurity vis-à-vis other users to the extent possible to enhance privacy.”[9]

In other words, in some scenarios, it’s preferable for a buyer or seller to know each other’s real identities, and in others, it’s best to hide them.  CPaaS providers are well-placed to help sharing economy platforms tackle the challenge of obscurity vs identity.

Anonymous numbers and phone validation

One way this can be done is by providing anonymous or dedicated numbers for text messages or calls through API integration (provided you’ve built your platform from scratch or are using an open source solution). This keeps notifications timely and convenient, while still allowing a degree of identity obscurity for users and control for the platforms. For example, imagine that you’re using a sharing economy app to hitch a ride to a party.  You’ve checked your driver’s rating and trust the platform to make sure they have a valid licence – but you still might not want them to have your real phone number. Keeping users’ real numbers obscured provides a complementary layer of security to reputation rating and platform intervention systems, reducing the perception of risk for users.  Incidentally, blocking the exchange of real contact info can also help avoid “platform leakage,” i.e. buyers and sellers side-stepping the platform (and their commission fee) to complete their own transactions once they’ve found each other.
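As a rough illustration of how such number masking can work, here is a hedged sketch using the ride example: both parties communicate through a proxy number instead of their real ones. The in-memory number pool, session store and send_sms stub are simplifying assumptions; a real integration would lease proxy numbers from a CPaaS provider and receive inbound messages via webhooks.

```python
# Minimal sketch of number masking between a rider and a driver.
# The pool, session store and send_sms stub are illustrative assumptions only.
NUMBER_POOL = ["+33700000001", "+33700000002"]  # proxy numbers leased from the provider
sessions = {}  # proxy number -> the two real numbers it bridges

def send_sms(to, sender, text):
    print(f"[sms] from={sender} to={to} text={text}")  # stand-in for the provider's send call

def open_masked_session(rider_number, driver_number):
    """Reserve a proxy number both parties use instead of each other's real number."""
    proxy = NUMBER_POOL.pop()
    sessions[proxy] = {"rider": rider_number, "driver": driver_number}
    return proxy

def on_inbound_sms(proxy_number, from_number, text):
    """Inbound-message handler: relay the text to the other party via the proxy number."""
    session = sessions[proxy_number]
    to = session["driver"] if from_number == session["rider"] else session["rider"]
    send_sms(to=to, sender=proxy_number, text=text)
```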

Want to add anonymous numbers to your platform?

Many platforms do indeed insist all communication goes through their channels for these reasons.  As TaskRabbit explains on their FAQ page, “…Taskers never receive your phone number or email since all communication goes through the TaskRabbit platform. Our in-app call and chat features, which utilizes a third-party system, allow Taskers to chat with you and call you directly. For this reason, there is no need to exchange personal contact information.” Uber likewise uses a messaging API for similar reasons: “For safety purposes Uber uses a number randomization software so that riders and drivers never actually have each other’s personal phone number.”

Conversely, instead of masking real phone numbers to maintain privacy or avoid platform leakage, you can also use text messaging to check if a user is who they say they are.  Cristóbal Gracia of ShareTribe explains,

“The next most common step is phone number validation. Typically, this is done by sending an SMS with a code to the phone number provided by the user, which then needs to be entered into the service to verify that the user owns the number. eBay uses a system that automatically calls users’ phone numbers to verify their identity.”

In this case, automated SMS work as an identity check, confirming that your new platform user actually is who they say they are. It’s preferable in many situations for platforms themselves to be able to check buyers’ and sellers’ real identities, in case of problems. A final advantage of going through a CPaaS provider is the option of accessing recordings or reports of SMS or voice activity in case of legal action – in a harassment case, for example.
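A minimal sketch of that validation flow is shown below, assuming an in-memory code store and a stubbed SMS send call rather than any particular provider’s API: generate a one-time code, text it to the number the user provided, then check the code they type back.

```python
# Hedged sketch of SMS phone-number validation. deliver_sms and the
# in-memory code store are illustrative assumptions, not a specific API.
import secrets

pending_codes = {}  # phone number -> expected code (use a store with expiry in production)

def deliver_sms(number, text):
    print(f"[sms] to={number} text={text}")  # stand-in for your CPaaS send-SMS call

def start_phone_validation(phone_number):
    """Generate a six-digit code and text it to the number the user provided."""
    code = f"{secrets.randbelow(1_000_000):06d}"
    pending_codes[phone_number] = code
    deliver_sms(phone_number, f"Your verification code is {code}")

def confirm_phone_validation(phone_number, submitted_code):
    """Return True if the submitted code matches, proving the user owns the number."""
    expected = pending_codes.pop(phone_number, None)
    return expected is not None and secrets.compare_digest(expected, submitted_code)
```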

Key takeaways

Providing safe interactions is essential for building or maintaining a successful sharing economy platform, and rating systems and platform interventions can help.  However, they by no means eliminate all the associated risks or apprehensions.  SMS and voice API integration, in addition to providing timely notifications for real time matchmaking, can add an extra layer of control and privacy for platforms to help offset the limitations of other safety mechanisms. 

[1] European Parliament Briefing, Tourism and the sharing economy, pg. 4. January 2017
[2] Federal Trade Commission Staff Report, The “Sharing” Economy: Issues Facing Platforms, Participants & Regulators, pg. 10. November 2016
[3] Ibid., 20.
[4] European Parliament Briefing, Tourism and the sharing economy, pg. 8. January 2017
[5] Federal Trade Commission Staff Report, The “Sharing” Economy: Issues Facing Platforms, Participants & Regulators, pg. 40-46. November 2016
[6] Ibid., 5-7.
[7] Ibid., 47-50.
[8] John R. Suler, “The Disinhibited Self,” in Psychology of the Digital Age: Humans Become Electric, ed. John D. Kelly et al. (Cambridge: Cambridge University Press, 2016), 95.
[9] Future of Privacy Forum, User Reputation: Building Trust and Addressing Privacy Issues in the Sharing Economy, pg. 3, June 2015
