Blog | HR |
8 Min Read
July 5, 2024

4 Disadvantages of AI in HR Management

by Nina Santos

Artificial Intelligence (AI) solutions have the potential to revolutionize critical challenges in healthcare, addressing workforce shortages, reducing costs, and enhancing the quality of patient care. From a technology standpoint, the benefits and capabilities are attractive. Yet, AI also has some disadvantages or weaknesses for healthcare HR.  


Managing and interpreting people analytics requires a delicate and empathetic hand. AI can bring objectivity and efficiency to HR tasks, but it also carries risks that can impact your workplace culture, hiring practices, and the use of AI itself. 


AI requires substantial training

The Theory

AI technology is not a new phenomenon. Rather, it is the term for a group of technologies that have been in play for many years, including the healthcare industry. These technologies range from machine learning, natural language and rule-based systems, automation, robotics, and the recent generative AI or large language model.  


One of the key features of AI is that the technology can learn to perform tasks autonomously by making predictions from statistics, interpreting language, and following if-then rules and workflows. The technology can take historical information and learn from past insights to complete new tasks. However, AI models need training to do these tasks and do them well. 


Training takes large volumes of data, especially if you want the technology to complete tasks reliably. It doesn’t mean building AI tech from scratch. You’ll likely use AI HR software or systems with a baseline function, but those solutions still need custom training to carry out tasks specific to your organization. 

In Practice

For instance, if you plan to use AI for healthcare HR tasks like attracting, recruiting, and engaging new talent, you need to instruct the system on what to prioritize. This involves establishing criteria for essential skills, traits, experience, education, and licensing, developing assessments, and prioritizing applications to train the system to identify and rank your preferences. 


Then, you must teach the software when to move a candidate forward in the application process. It takes time to set the rules and workflow and continual oversight and auditing to verify that the candidates the software selects are the right choice.  


Potential for bias

The Theory

AI HR tools often purport to reduce bias and increase fairness in the hiring process and HR management. These tools reduce the amount of human input and rely on an objective set of parameters to interpret data and make decisions, eliminating much of the unreliable gut feelings, intuition, or experiences that often arise out of unconscious bias. 


However, AI tools can develop biases and discriminate against candidates or applicants. How? Through a series of mechanisms, but mainly that AI technology doesn’t account for or recognize human complexity. AI is a technology driven by parameters and mathematical logic and doesn’t necessarily consider the qualitative characteristics of individuals, like attitudes or beliefs, or the context in which these occur. 


In other words, AI may analyze candidates but take characteristics out of context and reduce people to numbers. It might not consider a person as a whole or see the value they might bring in personality, connected, or other indefinable traits that make humans, well, human.   


In addition, HR managers must input parameters, ask questions, interpret data, and draw conclusions, but humans aren’t perfect. These technical and data science skills may be outside the typical HR manager skillset, which may skew judgments. The result is an injection of unintentional bias into the training, and these occurrences have consequences.  

In Practice

In 2023, the Equal Employment Opportunity Commission (EEOC) settled its first lawsuit alleging age discrimination against an AI HR software company. Many long-term care organizations use this platform, making it a key moment in healthcare HR.  


According to the consent decree, the company programmed its software to auto-reject female applicants aged 55 years and male applicants aged 60 years. The system automatically rejected over 200 older applicants, leading to substantial bias. The result was a $365,000 settlement.  


Paying attention to biases is essential. Examples of outright bias might be setting guidelines that automatically exclude people by age, race, gender, or ethnicity.  


Unintentional discrimination might also happen when you set experience or other career parameters. People with a higher chance of having a non-linear career path or less opportunity for promotion may automatically be eliminated from the candidate selection.   


Concerns about data privacy and security

The Theory

AI HR solutions need to consume and analyze vast amounts of data about people to view trends, predict outcomes, and complete tasks. Some models use publicly available data on the internet for training. Other forms use only internal data or data submitted during an application process.  


However, these practices can quickly become an ethical and privacy rights issue. Without explicit notice and consent for the use and storage of personal data for AI purposes, you might collect data without considering privacy or transparency.  


In addition, the security of personal or sensitive data is another concern. Hacking, cyberattacks, and data breaches are now a common occurrence across every industry, but AI technologies are the most vulnerable.  


Where traditional software becomes compromised from malicious code or code errors, AI becomes compromised through the model itself, which is unfixable. An attacker can manipulate the model to perform as requested by giving it prompts or tweaking the data the model sees or reads.  

In Practice

For example, chatbots are well-known to be some of the most insecure forms of AI. In 2023, a group of cybersecurity professionals competed in a contest during a conference to convince a chatbot to reveal secure information. One competitor got a chatbot to reveal credit card information by simply prompting it to connect a name with a card.  


At the same time, many organizations using AI are already experiencing security problems. A 2024 report found that 77% of businesses have already had AI security breaches and are increasing security budgets to address these problems.  


The government also recognizes the gravity of cybersecurity issues and recently announced a new program through the Advanced Research Projects Agency for Health (ARPA-H). The program will spend $50 million to develop new tools for healthcare facilities that will help fight security concerns.  


Potential for loss of connection with talent

The Theory

One worry about AI HR solutions and healthcare HR software is the risk of disconnection, which can lead to oversimplification and biased performance evaluations. For example, HR software designed to enhance efficiency and streamline workflows may track emails, response times, and time dedicated to charting or training to optimize time management. However, these tools analyze behavior but only from a reports standpoint.  


Data-driven insights might produce a faulty picture of your talent, reducing them to numbers and perceived performance but not representing their actual work. A person who knows how to game the numbers can appear to be exceptionally productive yet produce poor care outcomes or other concerns and vice versa. These disconnections from your talent can lead to promoting or dismissing the wrong staff and creating unfairness within the workplace culture.  

In Practice

The disconnections are making waves within the healthcare industry, causing concerns between nurses and employers. In May 2024, the National Nurses United union called for a pause on AI technology in healthcare because they don’t trust employers to implement AI with patient safety in mind.  


According to the union survey, half of employers use AI to assess patient acuity and predict staffing requirements, but 69% of nurses said their assessment doesn’t align with the AI. These inconsistencies between professional judgment and AI calculations and the history of employers depending on tech more than nurses has led to protests in a few locations.  


Relying heavily on AI may encourage unnecessary separation between you and your staff, leading to a lack of cultural awareness, discord, and distrust. In contrast, leveraging AI as a tool rather than a stand-in boss or HR manager can help you maximize technology without hindering relationships.  

Navigating AI in HR processes

AI isn’t perfect — it comes with weaknesses — but it also offers an incredible opportunity to transform work. Healthcare HR can become more efficient, cost-effective, and simplified, allowing teams to focus on people and relationships.  


The key is to implement AI with oversight. Assess parameters, questions, and guidelines for bias, and perform audits to confirm the software is performing as expected. Implement software and solutions with robust security standards and invest in upgrades and IT support. Understand the critical need for safeguarding digital information against theft, corruption, or unauthorized access, and partner with organizations that prioritize data security, such as Empeon.  


Finally, put people first, and focus on AI as a tool, not a cure-all for HR management. With a people-centric model, you can enhance relationships and build trust and engagement.  


AI HR tools, such as Empeon, help you simplify your process, giving you more time for your most valuable asset—people. Learn how streamline your workflow and power your department with Empeon. 

Related Post

How HCM Software Solutions Transform the Healthcare Industry

Long-term care providers and nursing homes are still feeling the effects of the pandemic — nursing homes still sit below their 2020 employment levels by over 14%. Front-line workers are still battling burnout while administrators scramble to navigate labor shortages. Human capital management software (HCM) helps to address these issues. This technology empowers nurses, streamlines… Continue reading How HCM Software Solutions Transform the Healthcare Industry

read more >

WeCare Centers Sees 7 Key Benefits with Empeon’s HR & Payroll Solution

WeCare Centers, a New York-based nursing home group, switched from contracting out its HR and payroll back office to building its own with Empeon’s cloud-based HCM solution. After two and a half years of success with Empeon, Estee Lermer, Corporate HR and Payroll Director at WeCare, shares seven key benefits the company has experienced since… Continue reading WeCare Centers Sees 7 Key Benefits with Empeon’s HR & Payroll Solution

read more >
Skip to content