That's a very judgmental reply that also misrepresents my comment.
I don't work at Google, so I cannot speak for them. My company does have studies and experiments around interviewing, but again, the metric they use for success is "inclined rate", e.g. doing ABC leads to an x% increase in inclined rate. I haven't seen any study linking interview outcomes to on-the-job performance, e.g. yearly performance ratings or promotions. In fact, many people have requested this data over the years.
> If you haven't seen any data then you just have a set of anecdotes, so your position is very weak. I have seen the data, but a person like you who is convinced it work won't believe me of course.
This comment seems quite rude, and again, it misrepresents my position. I didn't say LC-style interviews don't work; I'm saying that I don't know, because I have seen no data showing whether they work better or worse than other alternatives.
Also, the landscape has changed, and what might have worked 5 or 10 years ago might not anymore. Tech interviewing has "professionalized" in a Goodhart's Law-esque fashion: people have become very dedicated to gaming the process, which might affect the quality of the outcomes. But just to be clear again, I want to see evidence one way or the other.
> I haven't seen any study linking interview outcomes to on-the-job performance, e.g. yearly performance ratings or promotions. In fact, many people have requested this data over the years.
But that is exactly what Google did: they linked interview results to later performance evaluations, and that is the correlation I'm talking about. The better candidates scored in the interviews, the better their later performance reviews. The correlation was very stable and didn't just appear at low levels.
Your company might not have done such a study, but Google did.
> Also, the landscape has changed, and what might have worked 5 or 10 years ago might not anymore. Tech interviewing has "professionalized" in a Goodhart's Law-esque fashion: people have become very dedicated to gaming the process, which might affect the quality of the outcomes. But just to be clear again, I want to see evidence one way or the other.
That is true; the data I saw is nearing a decade old now, but at least back then it showed that the process worked.
> The better candidates scored in the interviews, the better their later performance reviews.
One possible explanation is that people who did well in the hiring process were valuable employees. Here's another: people who were good at gaming one evaluation system run by Google were also good at gaming another evaluation system run by Google.