Perhaps the op needs to define the "perfect AI." Because in so far as I understand it, that creature will never exist to me.
To some extent, businesses know they can't maximize profits by doing illegal or highly unethical things that would cause boycotts, reduce sales, or limit their access to the best employees. Those days are gone in societies with freedom of expression, information, and communication.
- - - Updated - - -
Yeah, there can't be an infallible human or artificial intelligence, because infallibility would mean reducing explanatory/predictive error rates to zero. That's impossible, though, because it would require a computer (a universal Turing machine) with an infinite amount of memory and infinite processing speed.
Good thing modern AI has the capability to assess and address all these nuances while maximising output.
That's fine too, because if you're so inclined, the AI's ability to process immense amounts of data would help you run your charity venture in the most charitable way as well.
Still wrong, because AI doesn't learn from an actor, it learns from data sets. The output doesn't simply match the data sets either; it is supposed to meet a certain accuracy requirement while maximising the desired metric. I can hardly imagine a company willing to sacrifice maximising an individual's performance in favor of not hiring people from a certain "bad" university, and if they are willing to do so, they would only hurt their own business.
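To make the "accuracy requirement plus desired metric" point concrete, here is a toy sketch (all model names and numbers are invented for illustration): treat accuracy as a hard constraint and only maximise the business metric among models that clear it.

```python
# Hypothetical sketch: among models that clear a fixed accuracy bar,
# pick the one with the best business metric. Names and numbers are made up.
models = [
    {"name": "m1", "accuracy": 0.92, "metric": 0.40},
    {"name": "m2", "accuracy": 0.88, "metric": 0.55},  # best metric, but below the bar
    {"name": "m3", "accuracy": 0.91, "metric": 0.48},
]

ACCURACY_BAR = 0.90

# Constraint first, optimisation second: accuracy is a hard requirement,
# and the metric is maximised over whatever survives the constraint.
eligible = [m for m in models if m["accuracy"] >= ACCURACY_BAR]
best = max(eligible, key=lambda m: m["metric"])
print(best["name"])  # m3
```

Note that the model with the highest raw metric (m2) never even gets considered, which is the whole point of the accuracy requirement.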
As I wrote, though, the data sets are based on their actions. Data sets don't just appear from nowhere; that's why some companies are valued so highly, since they have good data sets.
You might be surprised by the real world. As for it being bad: there are two important factors. One is how much you hurt the business by recruiting suboptimally, and the second is how much time you spend on this. Using a simple heuristic might lose you a few good candidates, but on the other hand, everything takes time, so perhaps it's better to skip them and spend more time on the remaining candidates.
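That trade-off can be put in numbers. A toy sketch, where every figure is invented purely for illustration: a crude filter loses some good candidates, but the deeper reviews it buys can still produce more good hires overall.

```python
# Toy model of the screening trade-off; every number here is made up.
def expected_good_hires(pool, good_rate, reviews, hit_rate):
    """Expected good hires when only `reviews` candidates get a real look."""
    return min(pool, reviews) * good_rate * hit_rate

# Review everyone, but shallowly: each review is rushed, so the hit rate is low.
review_all = expected_good_hires(pool=200, good_rate=0.10, reviews=200, hit_rate=0.30)

# A crude heuristic drops half the pool (including some good candidates, so
# good_rate dips), but the remaining reviews are deeper and hit more often.
heuristic = expected_good_hires(pool=100, good_rate=0.08, reviews=100, hit_rate=0.80)

print(round(review_all, 2), round(heuristic, 2))  # 6.0 6.4
```

With these particular numbers the heuristic wins; with others it wouldn't, which is exactly why the two factors above both matter.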
You are not programming the data sets; the data is collected from events and objects in the real world, and the data sets can only ever be limited. If you've heard of Laplace's demon, that's an example of an ideal AI predictor: one that has all the data in the universe and the capability to process it. Being an extreme, it is nevertheless the direction AI development is heading.
Last edited by Yadryonych; 2021-03-29 at 06:33 PM.
AI can and should be used by large corporations to scan the initial batch of regular applications. Otherwise, your application at Google, Microsoft, Samsung, etc. would either never be processed or only months or years later. How many applications do you think they receive per day?
But once that first application phase is over, a human must take over. Only human-to-human communication can give you a proper picture of the person applying. AI MIGHT be used during those later stages in combination with assessment centers, but the final decision should be made by a human weighing both the AC performance and the interpersonal performance.
On the firing side of things, an AI firing you outright should be illegal. An AI can be used to flag poor performance in KPI-driven environments, but at most its job should be to trigger a review process with your management line; from there, a professional must evaluate the reasons for the lack of performance in personal conversations. An AI will never understand your personal circumstances, but humans will. A good employer will consider that in their decision making.
That's my take on it at least. A fully automated hire and fire AI process is literally what my nightmares are made of.
Artificial Intelligence should not be used for hiring, human resources (HR), or firing.
Those are exactly the places where you would want to see a human in the position.
FOMO: "Fear Of Missing Out", also commonly known as people with a mental issue of managing time and activities, many expecting others to fit into their schedule so they don't miss out on things to come. If FOMO becomes a problem for you, do seek help, it can be a very unhealthy lifestyle..
I used to work somewhere that ran every application through a computer to check for a reference from the most recent supervisor. If the application didn't have that reference returned, the system trashed it, and it never got to a human at all.
The effect was that they never saw any applications for that position, except maybe one or two.
The position had so many openings that the company started offering bonuses if you applied and were hired. They needed close to 50 positions filled in a given month, but "no one was applying." It turned out the computer was kicking out hundreds of applications that were perfect for the positions; they just never saw them because the "AI" rejected them.
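The failure mode described above is easy to sketch (field names here are hypothetical): an all-or-nothing rule silently empties the pipeline, so at minimum the rejects should be counted and surfaced for a human to review.

```python
# Sketch of the hard pre-filter described above (field names are hypothetical).
applications = [
    {"name": "A", "reference_returned": True},
    {"name": "B", "reference_returned": False},
    {"name": "C", "reference_returned": False},
]

def hard_filter(apps):
    # Original behaviour: anything missing the supervisor reference vanishes,
    # and nobody ever learns how many applications were dropped or why.
    return [a for a in apps if a["reference_returned"]]

def audited_filter(apps):
    # Same rule, but the rejects are kept and reported instead of discarded.
    passed = [a for a in apps if a["reference_returned"]]
    rejected = [a for a in apps if not a["reference_returned"]]
    return passed, rejected

passed, rejected = audited_filter(applications)
print(len(passed), len(rejected))  # 1 2
```

The filter itself is identical in both versions; the only difference is whether anyone ever sees the two-thirds of the pool that got dropped.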
That article seems to confirm that the AI was pure garbage:
Problems with the data that underpinned the models’ judgments meant that unqualified candidates were often recommended for all manner of jobs, the people said. With the technology returning results almost at random, Amazon shut down the project, they said.
No.
What should be made illegal is firing thousands of employees on one end and giving executives big bonuses on the other.
“The biggest communication problem is we do not listen to understand. We listen to reply.” - Stephen Covey
Unfortunately, your view is going to be proven completely wrong. Humans make tons of bad judgement calls when evaluating talent. I can easily see a well-programmed AI exclusively making hiring choices, although I think the first thing an AI would do is scrap the current hiring model in favor of performance evaluation tests to determine the correct candidate, using criteria we currently do not use.
Well, better an AI than incompetent girls from HR. Honestly, from what I have seen, it's the most incompetent department in every single company.
And no, I'm not sexist; I have yet to see an HR department where 90% of the employees aren't women.
Honestly, it would benefit both companies and employees.