As we enter a new decade, it's hard not to look back and reflect on how much has changed. Twenty years ago, the world was a completely different place than it is today.
According to a 2019 IDC study, worldwide spending on Artificial Intelligence (AI) was estimated to reach $35.8 billion in 2019 and is expected to more than double to $79.2 billion by 2022, an annual growth rate of 38% for the 2018-2022 period. The economic benefits and utility of AI technologies are clear and compelling. Applications of AI may well address some of our most vexing social challenges in areas such as health, the environment, economic empowerment, education, and infrastructure. At the same time, as AI technologies become more pervasive, they may be misused and, in the absence of increased transparency and proactive disclosure, create ethical and legal gaps. Increased regulation may be the only way to address such gaps.
One of the most vexing problems for organizations is mitigating GDPR compliance risk when dealing with third parties, particularly the nature and extent of the obligations between data controllers and processors. Under the GDPR's accountability principle, organizations must adhere to the six fundamental principles for safeguarding privacy rights, which govern the collection, processing, and disposition of personally identifiable information. These obligations extend beyond the walls of an organization to any third party that processes personally identifiable information on its behalf. The GDPR also defines "processing" broadly and imposes stringent requirements on organizations that engage third parties to process personally identifiable information.
A further challenge for industry and legislators is the apparent tension between privacy rights and the rapid adoption of blockchain-based applications, which are expected to reach $10.6 billion in revenue by 2023.
Data Privacy Day takes place annually on January 28th in recognition of the January 28, 1981 signing of Convention 108, the first legally binding international treaty on privacy and data protection. The day, led officially by the National Cyber Security Alliance (NCSA), is an international effort to "create awareness about the importance of respecting privacy, safeguarding data, and enabling trust".
More rigorous privacy regulations such as the EU GDPR, along with a number of US privacy initiatives such as the recently enacted California Consumer Privacy Act, impose higher standards on data controllers and processors to safeguard privacy rights – including data subject consent management, accommodation of data subject requests, data portability, and more onerous accountability standards for controllers and processors.