Went to a talk about Amazon's Mechanical Turk. Overall, the talk was ok, despite being a general introduction.
To start, some judgmental bitching on my part:
I am not sure I understood the purpose of this talk. All points were touched upon briefly and, as far as introductions go, Aaron Shaw carried his point across while keeping the attention of what looked like a fairly bored, even if very intelligent-looking, crowd. So the talk itself was ok – some humor, point made – but what ensued afterward was just average mixed in with straightforwardly dull. The answers ran much too long, and the substance… drove me up the freaking wall…
Here we have a technology that has been around for a while – on average, it lets people earn a wage of somewhere between $3 and $4 an hour. At the same time, we have Iraq, with about 20% unemployment significantly contributing to destabilization and, according to a friend of mine who worked there, a need for jobs paying $4 to $5 an hour… yet what was discussed? Questions and discussion included things like "Is it ethical to pay someone 1 cent per task? That’s exploitation…" and "What about the ethical implications of using other people’s work that you pay little for and give no recognition?", all the way to the hilarious "Shouldn’t we discuss the potential for racial discrimination?". AT FUCKING HARVARD! Where is the brilliant imagination and concern for humanity?! Did I miss something? Are these topics really so jaded, and did they cover, at the beginning of the talk, why the Turk model is of no significance for humanity? Don’t get me wrong: these may all be good subjects for a philosophy class, but how detached must you be to discuss ethics before socio-economic implications? Ugh…
I mean, either these people live in some sort of a protective bubble, or I am really missing something.
Although, to be fair, as far as being a douche goes, I contributed my fair share of idiotic commentary after the meeting… jumped to all sorts of conclusions about all sorts of people – but then, I am not a Harvard academic.
The atmosphere felt stale and rigid, with people more concerned with their egos than with the actual issues. I don’t get it. Apparently, judging by the commentary at the bottom of this page, I was not the only one frustrated.
Enough bitching – I am becoming as petty as the people I chastise. Some constructive thoughts/questions that I think would be very interesting:
1. Why has this not been deployed internationally? (Or has it?) According to Aaron, Amazon’s Turk is subject to the Patriot Act, and they dealt with that by requiring US-based accounts. Interesting point. I remember a good article about eBay vs. PayPal (sorry, can’t find the link – the main idea was that, because PayPal had nothing to lose, they broke lots of rules and were consequently able to beat out eBay despite having an inferior product) – could this mean an opportunity for an Amazon competitor in this space who doesn’t have much to lose?
2. I still don’t quite understand which problems are actually being solved through this system. It would be very productive to categorize them and work out a marketing strategy for driving project providers toward Turk-style models. My brother gave the Turk a try but ran into low-quality results, which should be fairly easy to address by building a self-referential system – essentially the same redundancy-and-verification models that must have been used to check the quality of human subjects back in the day (see the sketch after this list).
3. What are the deployment strategies? I understand there are mobile-phone-oriented approaches that may work. Could this be used to provide jobs in areas with high levels of poverty, in both the developing and the developed world?
4. Could this be coupled with education systems, so that people who participate must solve problems and read material related to the problems at hand? For example, if you get people in Africa to use this system, provide questions that would improve skill/knowledge levels in immediately applicable fields: agriculture, math, marketing, etc.
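To make the "self-referential" quality idea from point 2 a little more concrete: the usual trick is to assign the same task to several workers, keep the majority answer, and then score workers by how often they agree with the consensus. Here is a minimal sketch of that – my own toy illustration, not anything Amazon or Aaron described; the task data and names are made up.

```python
from collections import Counter, defaultdict

def majority_answers(results):
    """results: (task_id, worker_id, answer) tuples from redundant assignments.
    Returns {task_id: consensus answer} by simple plurality vote."""
    votes = defaultdict(Counter)
    for task_id, _worker_id, answer in results:
        votes[task_id][answer] += 1
    return {t: c.most_common(1)[0][0] for t, c in votes.items()}

def worker_agreement(results, consensus):
    """Fraction of each worker's answers matching the consensus –
    a crude reliability score for filtering or weighting future work."""
    hits, totals = Counter(), Counter()
    for task_id, worker_id, answer in results:
        totals[worker_id] += 1
        if answer == consensus.get(task_id):
            hits[worker_id] += 1
    return {w: hits[w] / totals[w] for w in totals}

# Made-up example: three workers each label two images.
results = [
    ("img1", "w1", "cat"), ("img1", "w2", "cat"), ("img1", "w3", "dog"),
    ("img2", "w1", "dog"), ("img2", "w2", "dog"), ("img2", "w3", "dog"),
]
consensus = majority_answers(results)
print(consensus)                               # {'img1': 'cat', 'img2': 'dog'}
print(worker_agreement(results, consensus))    # w3 scores lower than w1 and w2
```

The point being that quality control doesn’t need anything exotic: redundancy plus a worker-reliability score gets you surprisingly far, and at Turk prices, buying two or three extra answers per task is cheap.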
Lots of other questions should come to mind, but they don’t, as I am tired and cranky and need sleep.