Grant Recipients Announced for Spring 2022 Security, Privacy & Trust Research Funding Grant
May 7, 2022
Four research teams of Pamplin faculty members have been awarded grants to help fund and support their research in areas related to security, privacy, and trust.
Shilpa Madan, Assistant Professor of Marketing at Pamplin
Dr. Madan and her research partner from Nanyang Technological University, Singapore, are one of the four research teams receiving a grant to help fund their project, “A Deep Learning Approach to Hypothesis Generation: Identifying Novel Antecedents of Trust in Artificial Intelligence.”
Madan’s research will identify the underlying values and beliefs that predict trust in AI across the world. “Lack of trust is arguably one of the key obstacles preventing the deployment of artificial intelligence (AI) in everyday life,” Dr. Madan writes. “Citizens across the world remain divided on whether or not they trust AI. Industry evidence shows that there are significant differences among countries in the extent to which their citizens trust AI.”
Using this data, Madan trained a deep learning model to predict trust-in-AI scores for 72 countries and identified several theoretically insightful and practically relevant antecedents that shape people’s support for AI. Dr. Madan plans to use survey and experimental methods to provide evidence for the role of these antecedents in increasing support for and trust in AI-based systems.
Madan states that citizens’ trust in AI is essential for the widespread deployment of technologies that can make people’s lives more convenient, safer, and more efficient. By identifying the key values and beliefs that shape people’s trust in AI, her research will help predict the trajectory of AI development and growth across the world.
Anthony Vance, Professor in Business Information Technology, and Michelle R. Lowry, Assistant Professor in Accounting and Information Systems
Professors Anthony Vance and Michelle R. Lowry were awarded a grant for their research project, “Having a Seat at the Table: The Effect of CISO Integration with Top Leadership on Data Security Breaches.”
Lowry and Vance will investigate how the characteristics of chief information security officers (CISOs), and their level of interaction with the board of directors or the executive suite of a firm, influence the likelihood of data security breaches. “Since its establishment in the 1990s,” the team states, “the role of the chief information security officer (CISO) has emerged as critically important to organizations in managing cybersecurity risks. Recognizing the role of CISOs [in reducing] data security breaches on firms, regulators have issued guidance to encourage greater integration between CISOs and boards. Unfortunately, many firms fail to fully embrace the CISO as a legitimate member of the executive suite or collaborator of the board of directors.” The study will also examine how the backgrounds of CISOs and boards of directors, as well as the extent of their interaction, affect this relationship.
Lowry and Vance plan to employ an archival approach, including text mining of CISO attributes from financial statements, to test their research model using a sample of data security breaches from 2006 to 2021. The team expects the results of their work will contribute to cybersecurity and IT governance research, as well as to practice, by examining the crucial role the CISO plays in interacting with management and how this interaction can mitigate risks related to data security breaches.
Gregory Kogan, Assistant Professor of Practice, Accounting and Information Systems
In their study, “Cyber Supply Chain Risk Mitigation through Internal Controls,” Professor Kogan and his research partner from the University of Scranton will analyze supply chain information security controls and compare the NIST and the AICPA Trust Frameworks in their capacity to provide countermeasures that reduce the risk of supply chain cyberattacks.
Professor Kogan writes, “The NIST and the AICPA Trust Frameworks both have components that can help organizations to implement supply chain attack countermeasures.” The study will detail assurance processes that can be put in place to assess and monitor the controls around supply chain risks, as well as how to report the results of such assurance processes to relevant internal and external stakeholders.
Kogan states that by implementing information security internal controls related to supply chain cyber risk mitigation, organizations will not only reduce their risk of a data breach but also be able to have their internal controls audited. By creating an auditable cybersecurity environment, organizations can report on their cybersecurity posture to their stakeholders. Such reporting can build greater trust with the organization’s stakeholders (investors, customers, and suppliers).
Kogan’s research will also help organizations mitigate cyber supply chain risk by providing a blueprint of internal controls. Additionally, for organizations that must choose a single framework, this research will help decipher the key differences between the NIST and the AICPA Trust Frameworks as they relate to mitigating cyber supply chain risk.
Idris Adjerid, Associate Professor, Business Information Technology; Tabitha James, Professor, Business Information Technology; and Viswanath Venkatesh, Verizon Professor, Business Information Technology
Professors Idris Adjerid, Tabitha James, and Viswanath Venkatesh will use their grant to support their study, “Complying with Our Algorithmic Overlords: How Power and Anthropomorphism Shape Resistance to Algorithmic Advice through Perceptions of Trust and Control.” The study examines how trust and control can be leveraged to encourage compliance with algorithmic recommendations. They will examine how the interaction of power and anthropomorphized algorithmic advice affects resistance to algorithmic recommendations through algorithmic trust or distrust and illusory or actual control of the algorithm.
Using AI-powered diet and nutrition apps for people suffering from chronic diseases, the research team will conduct two sets of experiments. The first will determine the role of trust or distrust in accepting algorithmic advice, and the second will determine the role of illusory or actual control in the acceptance of algorithmic advice. The team states that their results will help determine how algorithmic advice can be presented to increase perceptions of trust and control, as well as the conditions under which these perceptions are achieved.
The team hopes that the results of their research will help encourage the use of AI to supplement an overwhelmed healthcare system by providing support for people suffering from chronic health conditions such as diabetes. They also hope to provide insight into the psychological barriers that must be overcome to encourage adoption of AI-powered assistance for health conditions, as well as valuable information for the design of such health apps to ease their acceptance.