Daniel Osborne: China, April 2019

人工智慧：当今媒体的热点话题之一

最近，在微博上，一张图片吸引了许多人的关注：在教室里，30个学生面对黑板坐着，看起来是一个典型的教室。可是，每个学生的脸上都被叠加了一个矩形。一个看黑板的学生的绿色矩形里写着“身份：000010，状态一：专心”。一个站在桌子后面的女学生的蓝色矩形里写着“身份：00001，状态三：回答问题”。这仿佛是乔治·奥威尔的反乌托邦科幻小说《1984》里的场景，可是这个噩梦成真了……

2017年5月，AlphaGo，一种人工智慧机器，打败了世界上最好的围棋手。这台在硅谷发明的人工智慧计算机赢得比赛，成为了中国企业弥补中美之间人工智慧差距的动机。随后，中国政府发表了一篇两万字的白皮书，阐明中国人工智慧研发的战略。该战略倡导把人工智慧融入生活的各个方面，比如医疗、法律、环保等行业。其中还有一个方面：“智慧教育”。在七所试点学校里，每一个教室的白板上方都装有一个很小的白色摄像头，每秒拍一次学生们的照片。然后，通过算法，它能识别出每个学生的身份，同时把他们的行为分为五类：听课、回答问题、写字、跟同学交流、睡觉。

AlphaGo on its way to beating the Go world champion, Ke Jie

中国父母渴望了解自己的孩子在学校里的表现。上课是孩子们唯一能避开父母视线的时刻，这使得父母担心极了。父母想控制孩子生活的每一个方面，可是老师只有一双眼睛。因此，每天放学时，父母都会问老师许多问题，比如“今天他又睡觉了没有？”或者“他是不是又跟同学不停地聊天儿？”现在，每个星期，父母都能获得算法分析出的孩子行为数据。

主张“智慧教育”的人坚持这种技术能够有效地监视学生的行为和专注力，因此能帮助提高他们的学习效率。中国的教育体制重视考试：成绩是一个孩子前途的关键，所以应该使用技术帮助学生。可是，老师们和家长却不同意：收集的数据是否真的可靠？譬如，如果两个学生互相帮助或讨论老师的问题，被归类为“分心”是否公平？

另外，人们也认为这种情况好像福柯的圆形监狱：只有一个看守监视囚徒，可是囚徒不知晓自己何时被监视，使得他们的行为犹如永远被监视着。教室里的摄像头具有同样的影响。学生因害怕惩罚而不敢打盹、开玩笑、打哈欠。他们不知道摄像头什么时候在运作，所以并没有更专心，反而想方设法不让自己睡着。休息时，学生不去室外玩一玩，反而打瞌睡。在微博上，一个网民写道：“马戏团里的猴子的笑容不是因为幸福，而是因为恐惧。”期末考试前，有些学生反抗校长，关掉了摄像头。

第三个问题是隐私保护。学校很可能没有得到父母的许可，因此上述七所学校里的28000名学生并不知道自己在参加人工智慧试验。缺乏许可可能导致“权力的不平衡”：学校有惩罚与开除学生的权力，父母不愿意因为反抗校方而冒毁掉孩子前途的风险。因此，学生沦为这场智能监控权博弈中的棋子。

可是学校里的人工智慧并不限于教室。在未来，学生也能够用人脸识别买午餐、向图书馆借书、买自动贩卖机的饮料。在贵州，如今已有十所学校采用“智慧校服”，用电子芯片监视学生的位置，鼓励学生多上课。在商店里，摄像头识别顾客，告诉员工这位顾客是稀客还是常客。然后，摄像头储存顾客的购物历史，帮助销售人员改善顾客的购物体验。

另外，警方现在也采用了人工智慧。在青岛啤酒节上，警察使用人脸识别逮捕了25名罪犯。可是当人工智慧出现故障的时候，可能会导致不实的指控与定罪，甚至盗窃或诈骗。比如2018年11月，中国著名企业家董明珠对此可能深有体会：一个马路上的摄像头把喷涂在公交车外部广告里的她的头像识别成了乱穿马路的人。

从我的立场来看，首先，人工智慧肯定能让我们的生活更方便、更安全，可是也会抑制我们的独立，要找到理想的妥协真的很棘手。我觉得跟西方人相比，中国人更容易接受人工智慧，因为他们已经更习惯路上触目皆是的摄像头与强大的权威。其次，如果一个社会的教育根植于成绩，只用统计数据来筛选学生，那么自然会有越来越多的学校愿意使用算法来判断学生的知识、行为等。可是，我怀疑如果学生感到压力山大，可能会引起糟糕的后果，并不能提高教育的效率。现在的中国孩子已经承受着庞大的压力，人工智慧的摄像头可能会是压垮他们的最后一根稻草……

ENGLISH:

Artificial Intelligence – one of the most hotly debated topics in today’s media

Recently, a photo attracted a lot of attention on Weibo. In a classroom, 30 students are seated facing the blackboard in what appears to be a typical classroom. But rectangles are superimposed over each student’s face. Inside the green rectangle over a student looking at the blackboard is written: “ID: 000010, Status 1: Concentrating.” Inside the blue rectangle over a student standing behind her desk: “ID: 00001, Status 3: Answering a question.” This may seem like a scene from George Orwell’s dystopian novel 1984, yet this nightmare has come true…

Monitoring the AI camera's results

In May 2017, AlphaGo, an AI machine, beat the world’s best Go player. The victory of this AI computer, invented in Silicon Valley, spurred the Chinese government to close the gap between the US and China in artificial intelligence. Consequently, the Chinese government published a 20,000-word white paper detailing its AI research and development strategy. The strategy advocated integrating AI into many areas of life, such as medicine, law and environmental protection. One other aspect was also included: “intelligent education”. In seven trial schools, a small white camera is mounted above the whiteboard in every classroom and takes a picture of the students once every second. Then, using algorithms, it can identify every student and categorise their behaviour into five types: listening, answering a question, writing, talking with other students, and sleeping.
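
To make that pipeline concrete, here is a minimal sketch of how a once-per-second capture, identify and classify loop could be structured. Everything in it is an assumption for illustration only – the function names, the Observation record and the placeholder classifiers are invented, since nothing about the real system’s internals has been published.

```python
# Illustrative sketch only: a hypothetical once-per-second classroom pipeline.
# The real system's models and data formats are not public; these are stubs.
import time
import random
from dataclasses import dataclass

# The five behaviour categories reported by the trial-school system.
BEHAVIOURS = ["listening", "answering", "writing", "talking", "sleeping"]

@dataclass
class Observation:
    student_id: str   # identity matched by face recognition (hypothetical format)
    behaviour: str    # one of the five categories above
    timestamp: float  # when the frame was captured

def capture_frame():
    """Placeholder for grabbing one frame from the classroom camera."""
    return object()

def detect_and_identify(frame):
    """Placeholder face detection + recognition: returns (student_id, face_crop) pairs."""
    return [(f"{i:06d}", None) for i in range(30)]

def classify_behaviour(face_crop):
    """Placeholder behaviour classifier; a real system would use a trained model."""
    return random.choice(BEHAVIOURS)

def run_once():
    """Process a single frame: identify each student and label their behaviour."""
    frame = capture_frame()
    now = time.time()
    return [Observation(sid, classify_behaviour(crop), now)
            for sid, crop in detect_and_identify(frame)]

if __name__ == "__main__":
    # One capture per second, as the article describes; observations like these
    # would then be aggregated into the weekly reports that parents receive.
    for _ in range(3):
        print(run_once()[:2])
        time.sleep(1)
```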

Chinese parents are desperate to know how their child is performing at school. Class time is the only part of the day when children escape their parents’ gaze, which makes parents exceptionally anxious. Parents want to control every aspect of their child’s life, but the teacher only has one pair of eyes. As a result, at the end of each school day it is quite common for parents to bombard the teacher with questions like “Did he fall asleep in class again today?” or “Was he chatting with his friends non-stop?”. Now, once a week, parents can access their child’s algorithmically analysed behavioural data.

Advocates of “intelligent education” maintain that this kind of technology can effectively monitor students’ behaviour and concentration, and therefore improve their learning efficiency. The Chinese educational system attaches great importance to exams: test scores are the key to a child’s future, so technology should be used to help students succeed. But teachers and parents disagree: is the data collected really reliable? For example, if two students are helping each other or discussing the teacher’s question, is it fair for them to be classified as “distracted”?

Moreover, people think this situation resembles Foucault’s panopticon: a single guard watches over the prisoners, but because the prisoners never know when they are being watched, they behave as if they were always under surveillance. The cameras in the classroom have exactly the same effect: students don’t dare snooze, crack jokes or yawn for fear of punishment. They don’t know when the cameras are operating, so rather than concentrating more, they do everything they can simply not to fall asleep. When it’s break time, they don’t go outside and play but doze off instead. On Weibo, one netizen wrote: “The monkey in the circus isn’t smiling because it’s happy; it’s smiling out of fear.” Before the end-of-term exams, some students rebelled against the headteacher and turned the cameras off.

The third issue is privacy. The schools most likely did not obtain the parents’ permission, so the 28,000 students in the seven schools mentioned above do not know that they are taking part in an AI experiment. This lack of consent can lead to an imbalance of power: schools have the power to punish and expel students, and parents dare not stand up to the school and risk ruining their child’s future. Consequently, students become pawns in this power game over intelligent surveillance.

Yet AI in schools is not limited to the classroom. In the future, students will be able to use facial recognition to buy lunch, borrow books from the library and buy drinks from the vending machine. In Guizhou, ten schools have now adopted “intelligent school uniforms”, which use electronic chips to track students’ locations and encourage them to attend class. In shops, cameras identify customers and tell staff whether someone is a first-time or a regular customer. The cameras also store purchase histories, which helps sales assistants improve the shopping experience.

The camera’s footage of Dong Mingzhu: her face is in fact part of an advert next to the bus’s rear wheel

In addition, the police have also started to use AI. At the Qingdao Beer Festival, police used facial recognition to arrest 25 criminals. But when AI goes wrong, it can lead to false accusations and convictions, as well as theft and fraud. In November 2018, for example, the well-known entrepreneur Dong Mingzhu experienced this first-hand: a roadside camera picked up her face on an advert printed on the side of a passing bus and identified her as a jaywalker.

From my point of view, firstly, AI can definitely make our lives easier and safer, but it can also restrict our independence, and finding the ideal compromise between the two will be really tricky. I think Chinese people are more accepting of AI than Westerners, since they are already used to the omnipresence of CCTV cameras and to a very powerful state authority. Secondly, if a society’s education is rooted in test scores and uses only statistics to select students, then of course more and more schools will want to use algorithms to judge a child’s knowledge, behaviour and so on. But I suspect that if students feel overwhelmed by pressure, this could lead to some awful consequences rather than improving learning efficiency. Chinese children are already under enormous pressure; AI cameras may well be the last straw…
