The last time Chuck Blatt searched for a job, about 10 years ago, he relied on a thoughtful cover letter, a résumé printed on nice paper and good rapport during a face-to-face interview.

Now, he said, "that is all out the window."

Since Blatt, 50, left his job as vice president of a painting and construction company in March, he has spent nearly every day in front of the computer in his Chicago home applying for jobs via automated processes.

He uploads his job history with the click of a button. He records videos of himself answering automated interview questions. He takes the lengthy online personality tests employers use to screen candidates.

Blatt, who is seeking a marketing position, said technology makes it easier to apply for more jobs. But other parts of the high-tech hiring process leave him uneasy.

"I have been turned down for positions that I thought I would be perfect for," Blatt said, and it is often impossible to know why. "There is no feedback because there is no one to talk to."

Technology is transforming hiring, as employers inundated with applications turn to sophisticated tools to recruit and screen job candidates. Many companies save time with video interviews or résumé filters that scan for keywords, and those at the leading edge are using artificial intelligence in a variety of ways: chatbots that schedule interviews and answer applicant questions; web crawlers that scour mountains of data to find candidates who aren't actively job hunting; and algorithms that analyze existing employee data to predict an applicant's future success.

Advocates of AI-enhanced hiring claim it reduces turnover by bringing on candidates who are a better fit. They also say a data-driven approach removes bias inherent in human decision-makers who, for example, might favor candidates who graduated from their alma mater.

But critics warn of the opposite effect: that some applicants could be unfairly weeded out.

Cathy O'Neil, a mathematician and author of the 2016 book "Weapons of Math Destruction," worries that algorithms built to predict whether an applicant will be a good fit, based on the types of employees who have succeeded before, could perpetuate implicit biases.

"If in the past you promoted tall white men or people who came from Harvard, that will come through in the algorithm," O'Neil said. "Algorithms just look for patterns."

The scoring is invisible, so even human resources departments don't know why an applicant might have been rejected, making it difficult for anyone to challenge the process, she said.
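O'Neil's point can be made concrete with a toy sketch. All of the data and names below are hypothetical, and this is not any vendor's actual system; it only shows that a model fit to biased past decisions echoes them.

```python
# Toy illustration: a screening model that "just looks for patterns"
# in past hiring decisions will reproduce those decisions' biases.
from collections import defaultdict

# Hypothetical historical records: (school, was_promoted).
# Past managers favored one school, so bias is baked into the labels.
history = [
    ("Harvard", True), ("Harvard", True), ("Harvard", True), ("Harvard", False),
    ("StateU", True), ("StateU", False), ("StateU", False), ("StateU", False),
]

# "Training": learn the historical promotion rate per school.
counts = defaultdict(lambda: [0, 0])  # school -> [promoted, total]
for school, promoted in history:
    counts[school][0] += int(promoted)
    counts[school][1] += 1

def score(school):
    """Score an applicant by the past promotion rate of their school."""
    promoted, total = counts[school]
    return promoted / total

# The learned scores simply echo the historical pattern.
print(score("Harvard"))  # 0.75
print(score("StateU"))   # 0.25
```

Nothing in the code mentions merit; the "pattern" it finds is just who was favored before, which is why auditors test such models against historical outcomes rather than trusting them by construction.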

Much of the technology used in the hiring process shows great promise for helping employers cut costs associated with high turnover, said Natalie Pierce, co-chair of the Robotics, AI and Automation Industry Group at Littler Mendelson, a law firm that represents management.

One client, a department store that couldn't retain cosmetics-department employees, discovered through analytics that it had mistakenly assumed that hiring gregarious employees would lead to greater sales, when in fact the best salespeople were problem-solvers who invested time helping customers.

By changing the type of person it hired, the store was "greatly able to reduce training costs and attrition and increase the amount of commissions going to employees," Pierce said.

But employers have to be careful. Algorithms designed to identify candidates similar to current high performers could screen out groups of people who are protected by anti-discrimination laws.

At a public meeting held by the Equal Employment Opportunity Commission to discuss the issue in 2016, a chief analyst at the federal agency described how an algorithm might find patterns of absences among employees with disabilities.

Even if the algorithm does not intentionally screen out people with disabilities, the impact could be discriminatory and therefore violate federal law, said Barry Hartstein, co-chair of Littler's diversity practice.

"This is an area that the regulators are recognizing is the wave of the future," he said. Littler's growing AI team tests hiring algorithms to ensure they produce the intended outcomes.

The government has not filed any lawsuits based on an employer's use of high-tech screening tools or algorithms, said Carol Miaskoff, associate legal counsel at the EEOC. But the agency is watching the trend, and employers need to be aware if the tech tools they use to hire and promote are prone to discrimination, she said.

For Blatt, the Chicago man seeking a marketing position, the shift away from human interaction with recruiters has been frustrating.

He second-guesses how his answers will be scored on the multiple-choice personality tests. And he feels awkward when, during automated video interviews, the computer asks follow-up questions that don't relate to his previous answer.

That's not to say human screeners are all they are cracked up to be. Blatt recalled a recent phone interview with a woman who took the call at a noisy Starbucks, making for a distracted exchange.

"Compared to the woman at Starbucks, it might have been better to have a robot," he said.