Artificial Intelligence | News | Insights | AiThority

TripleBlind Platform Analyzed By Polsinelli PC

Organizations Can Collaborate and Commercialize the Estimated 93% of Data Currently Unavailable, While Avoiding Violations of GDPR, HIPAA and Other Regulatory Standards

Polsinelli PC has analyzed the privacy claims behind TripleBlind's Enterprise Data Privacy as a Service technology, which unlocks new revenue opportunities while automatically enforcing privacy regulations. The firm found that data de-identified using TripleBlind's one-way encryption and distributed computing can reduce legal risk for all parties involved in a data exchange.


Data sharing via TripleBlind’s technology includes three roles: a data provider, an algorithm provider (data user) and TripleBlind. In the scenario below, Hospital B (the algorithm provider and data user) wants to know if it shares any patients with Hospital A. Hospital A agrees to share this information with Hospital B, and TripleBlind enables the hospitals to collaborate and provide better patient care while never decrypting data:

  • Hospital A (the data provider) and Hospital B (the data user) connect their databases to TripleBlind software locally (behind their firewalls), and TripleBlind's technology encrypts each hospital's data (still behind the respective firewalls).
  • Hospital B requests patient information to determine if it shares patients with Hospital A.
  • TripleBlind's technology enables a comparison between Hospital A's and Hospital B's data without decrypting it, and forwards the patients in common (and no other information) to Hospital B.
  • TripleBlind’s technology prevents Hospital B from performing any other operation on the data forwarded to it.
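TripleBlind's actual protocol is proprietary and not described in the report, but the matching step above resembles a private set intersection. As a hypothetical illustration only, the sketch below stands in a simple keyed-hash scheme: each hospital one-way transforms its patient identifiers behind its own firewall, and only the blinded values are compared, so raw identifiers are never exchanged. The shared key and patient IDs are invented for the example.

```python
import hashlib
import hmac

# Assumption for this sketch: both hospitals have been provisioned a shared
# secret key out of band. This is NOT TripleBlind's method, just a stand-in.
SHARED_KEY = b"demo-key-not-for-production"

def blind(patient_ids):
    """One-way transform each identifier so raw IDs never leave the firewall."""
    return {
        hmac.new(SHARED_KEY, pid.encode(), hashlib.sha256).hexdigest(): pid
        for pid in patient_ids
    }

# Each hospital blinds its own records locally.
hospital_a = blind({"patient-001", "patient-002", "patient-003"})
hospital_b = blind({"patient-002", "patient-004"})

# The comparison sees only blinded values; the patients in common
# (and no other information) are revealed to Hospital B.
common_blinded = hospital_a.keys() & hospital_b.keys()
shared_patients = {hospital_b[h] for h in common_blinded}
print(shared_patients)  # {'patient-002'}
```

Because HMAC is a one-way function, neither party can recover the other's non-matching identifiers from the blinded values alone, which mirrors the "never decrypting data" property described above.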

“Gartner anticipates that by 2023, 65% of the world’s population will have their data protected under privacy regulations. As more governments introduce privacy laws, it’s imperative for organizations to understand and comply with these different regulations,” said Polsinelli attorney and shareholder, Elizabeth Harding. “We analyzed TripleBlind’s claims that its technology has privacy-enhancing features that enable compliance with data privacy laws including GDPR and HIPAA.”

Polsinelli found that use of TripleBlind’s technology to permanently and irrevocably de-identify data:


  • Reduces exposure under GDPR by taking the data processed outside the definition of personal data; and
  • Reduces exposure under HIPAA by ensuring that data is processed in a de-identified fashion.

“TripleBlind’s breakthrough solution in cryptography and data privacy allows for safer and more compliant collaboration globally,” said Riddhiman Das, co-founder and CEO of TripleBlind. “We will see this become increasingly important as areas around the world continue to implement their own conflicting data privacy laws, similar to what we are seeing in the United States in states like California and Virginia.”

GDPR defines personal data as "any information relating to an identified or identifiable natural person," a broad definition that can lead to confusion and privacy risks. GDPR applies to the processing and sharing of personal data; it does not apply to anonymous data. TripleBlind allows entities to share permanently and irrevocably de-identified, anonymous data while still meeting the same data-sharing needs, eliminating the risk of violating GDPR.

In the report, Polsinelli walks through a use case in which TripleBlind uses anonymized random pieces to let an e-commerce website detect fraud via a cloud-based service provider. Traditionally, the e-commerce site would provide raw data, including personal data, directly to the service provider, which would run its algorithm against that raw data. This would trigger various GDPR obligations, including the requirement for a lawful basis for processing and, to the extent the personal data in question was considered a 'special category' of personal data, an exception to the prohibition on processing such data. Instead, TripleBlind's technology achieves the same outcome without the vendor ever processing identifiable data.

As it relates to GDPR, Polsinelli finds that data permanently and irrevocably anonymized with TripleBlind's technology faces minimal privacy risk from the perspective of each role in a typical use case:

  • The data provider reduces its obligations under the GDPR by taking steps to minimize the sharing of personal data with third parties, including algorithm providers.
  • The algorithm provider can altogether avoid the GDPR by performing its functions through TripleBlind (it does not process personal data).
  • The GDPR does not apply to TripleBlind in its role as the technology vendor because TripleBlind does not process personal data.

In the United States, HIPAA regulates the use and disclosure of Protected Health Information (PHI). Through a combination of one-way cryptography and data splitting, TripleBlind mitigates users' risk of violating HIPAA by enabling entities to collaborate without using or disclosing PHI.
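The report does not specify how TripleBlind's data splitting works internally, but the general idea can be illustrated with two-party additive secret sharing, a standard technique in which a sensitive value is split into random shares that individually reveal nothing. The field modulus and example values below are invented for the sketch and are not drawn from TripleBlind's design.

```python
import secrets

# Illustrative field modulus (a Mersenne prime); an arbitrary choice here.
P = 2**61 - 1

def split(value):
    """Split an integer into two shares. Each share alone is uniformly
    random and carries no information about the underlying data element."""
    share_a = secrets.randbelow(P)
    share_b = (value - share_a) % P
    return share_a, share_b

def reconstruct(share_a, share_b):
    """Only the combination of both shares recovers the original value."""
    return (share_a + share_b) % P

age = 42  # a sensitive data element, e.g. a patient's age
a, b = split(age)
assert reconstruct(a, b) == age

# Additive shares also support computing on split data: adding shares
# element-wise yields shares of the sum, without exposing either input.
age2 = 35
a2, b2 = split(age2)
total = reconstruct((a + a2) % P, (b + b2) % P)
print(total)  # 77
```

In such a scheme, a party holding only one share holds what amounts to random noise, which is consistent with the claim that no data element constituting PHI is ever disclosed to any single recipient.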

TripleBlind allows healthcare providers to build more ethical and equitable AI-based models and deliver "deep medicine" globally while simplifying compliance with local and regional privacy standards.

Expert Dr. Kalikinkar Mandal conducted a de-identification analysis of TripleBlind under the Expert Determination Method and concluded that the risk of the algorithm provider identifying an individual based on PHI is very low with respect to data processed through TripleBlind's technology.

As it relates to HIPAA, Polsinelli finds that data permanently and irrevocably anonymized with TripleBlind's technology faces minimal privacy risk from the perspective of each role in a typical use case:

  • Information from the data provider is immediately encrypted and split in a manner that results in no data elements which would be considered PHI being disclosed to the recipient.
  • The encrypted result does not enable the algorithm provider to reverse engineer the original data input, so the result data also does not contain any data elements which would be considered PHI.
  • TripleBlind never processes any PHI, whether encrypted or otherwise, meaning TripleBlind is not a business associate of either the data provider or the algorithm provider.
