Online Test: A compilation of excellent Grade 12 English reading-comprehension exercises with answer keys
Topic 26: Artificial intelligence (Part 2)
8,776 attempts · 16 questions · 60 minutes
Question 1:
Don’t look now, but artificial intelligence is watching you. Artificial intelligence has tremendous power to enhance spying, and both authoritarian governments and democracies are adopting the technology as a tool of political and social control. Data collected from apps and websites already help optimize ads and social feeds. The same data can also reveal someone’s personal life and political leanings to the authorities. The trend is advancing thanks to smartphones, smart cameras, and more advanced AI.
An algorithm developed at Stanford in 2017 claimed to tell from a photograph whether a person is gay. Accurate or not, such a tool creates a new opportunity for persecution. “Take this type of technology, feed it to a citywide CCTV surveillance system, and go to a place like Saudi Arabia where being gay is considered a crime,” says Lisa Talia Moretti, a digital sociologist. “Suddenly you’re pulling people off the street and arresting them because you’re gay, because the computer said so.”
No country has embraced facial recognition and AI surveillance as keenly as China. The AI industry there has flourished thanks to fierce competition and unrivaled access to personal data, and the rise of AI is enabling tighter government control of information, speech, and freedoms. In some Chinese cities, facial recognition is used to catch criminals in surveillance footage, and to publicly shame those who commit minor offenses. Most troubling, AI is being used in Xinjiang, a province in Western China, to persecute Muslims. Even if China’s AI capabilities are exaggerated, the AI boom there is having a chilling effect on personal freedom, says Ian Bremmer, an expert on global political risk and founder of the Eurasia Group. “You just need a government that is starting to get that capacity and make it known, and have a few people that are sort of strung up as examples, and suddenly everyone is scared,” he says.
This might feel like a distant reality, but similar tools are being developed and used in the West. Just ask Glenn Rodriguez, who faced judgment from an algorithm when seeking parole from prison in the US. Despite 10 years of good behavior, Rodriguez saw how an algorithm called COMPAS, designed to predict inmates’ likelihood of reoffending, would be biased against him. And even though the parole board went against the computer program’s advice, and set him free, they agreed to impose the algorithm’s recommended curfew. “I’m still haunted by COMPAS,” Rodriguez warns.
(Source: https://www.wired.com/)
Which best serves as the title for the passage?
Answer: D
A. Humans' judgmental attitude toward AI
B. AI intrudes too much on our privacy
C. Why should you never disgrace the government?
D. Artificial intelligence is watching and judging us
Based on the information in the opening paragraph:
Don’t look now, but artificial intelligence is watching you. Artificial intelligence has tremendous power to enhance spying, and both authoritarian governments and democracies are adopting the technology as a tool of political and social control.
=> Thus, the passage is about artificial intelligence watching and controlling us, so D best serves as the title.
Question 2:
The word “leanings” in paragraph 1 is closest in meaning to _______.
Answer: A
A. bias /ˈbaɪəs/ (n): an inclination or prejudice, often an unfair preference, in making judgments; a tendency toward something specific
B. preference /ˈprefrəns/ (n): liking one thing more than another (a personal comparison between one's own likes)
C. fondness /ˈfɑːndnəs/ (n): liking, affection
D. link /lɪŋk/ (n): connection, relationship
=> Synonym: leaning /ˈliːnɪŋ/ (n): a tendency or inclination toward a particular thing, topic, or field ≈ bias
=> In this context, "bias" is the closest match because it conveys a tendency or inclination toward something specific, rather than a personal comparison between likes or hobbies.
"The same data can also reveal someone’s personal life and political leanings to the authorities."
Question 3:
The word “persecution” in paragraph 2 is closest in meaning to _______.
Answer: D
A. tyranny
B. punishment
C. torture
D. discrimination
=> Synonym: persecution (mistreatment of a person or group) ≈ discrimination
"Accurate or not, such a tool creates a new opportunity for persecution."
Question 4:
The word “it” in paragraph 2 refers to _____.
Answer: D
A. tool
B. opportunity
C. photograph
D. technology
"It" replaces the noun "technology," which is mentioned immediately before it.
"Take this type of technology, feed it to a citywide CCTV surveillance system, and go to a place like Saudi Arabia where being gay is considered a crime," says Lisa Talia Moretti, a digital sociologist.
Question 5:
According to paragraph 3, which job is NOT performed by AI surveillance system in China?
Answer: B
A. Tracking lawbreakers through biometric facial recognition
B. Identifying followers of Islam in order to deal with them secretly
C. Exposing those who commit minor offenses to the public
D. Acting as a means for the authorities to monitor citizens
Based on the information in paragraph 3:
The AI industry there has flourished thanks to fierce competition and unrivaled access to personal data, and the rise of AI is enabling tighter government control of information, speech, and freedoms. In some Chinese cities, facial recognition is used to catch criminals in surveillance footage, and to publicly shame those who commit minor offenses.
=> The paragraph describes open surveillance and public shaming; secretly dealing with followers of Islam is not mentioned, so B is not a job performed by the system.