
Experts call for a united front to protect children from escalating cyberthreats


RIYADH: The landscape of cyberthreats is constantly evolving, posing an increasing risk to one of society’s most vulnerable groups: children. Today’s youth, often described as digital natives, have near-unfettered access to the internet, and with that access comes exposure to an exponentially growing array of cyberthreats.

The risks are manifold: cyberbullying, online predators, exposure to inappropriate content, and privacy breaches. The anonymity of the internet emboldens perpetrators, fostering an environment ripe for the exploitation of minors. Cybercrime is rising steeply and is expected to cost the world a colossal $8 trillion in 2023, a sum that would make it the world’s third-largest economy if it were a country, according to Cybersecurity Ventures.

These issues were tackled at the third Global Cybersecurity Forum that concluded in Riyadh on Thursday. Hosted by the National Cybersecurity Authority and the Saudi Information Technology Co., this year’s forum discussed some of the most pressing challenges in the realm of cyberspace.

Participants delved into the repercussions of these challenges across various sectors, with a special emphasis on supply chains and the rapidly evolving landscape of smart cities.

The GCF also aimed to encourage multi-stakeholder collaboration on an international scale, gathering industry experts, decision-makers, CEOs, senior government and academic representatives, and international companies from more than 120 countries.

Tech companies, empowered by advancements in AI, hold considerable responsibility within this ecosystem. They are tasked with the design of products featuring robust security and privacy controls and must be swift to counteract threats. These entities are pivotal in safeguarding young users and are well-placed to develop AI tools that can monitor and restrict harmful content and behaviors.

A key initiative is Security by Design, a crucial framework in technology development, particularly when it comes to protecting young users online. For child protection, this means technology is created with the highest default privacy settings, minimal data collection, and stringent content filtering to bar harmful material. This approach involves the use of AI for real-time content and behavior monitoring, transparent policies, and user data control. It requires tech companies to stay vigilant and responsive to ensure swift action on security threats.

When these practices are incorporated, they can substantially reduce the risks to young users. Prioritizing security from inception builds consumer trust and lays the groundwork for a safer digital environment, allowing children to learn, explore, and connect with minimal exposure to online threats. Moreover, a proactive security stance helps companies meet international regulations and prevent the high costs and reputational damage of security failures.
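Purely as an illustration of the security-by-design principle described above, and not a description of any real product, the following minimal sketch shows what strict-by-default child account settings, data minimization, and a naive content filter might look like in code; all names, settings, and terms are assumptions invented for this example.

```python
# Illustrative sketch only: hypothetical "security by design" defaults for a
# child account. Names and settings are assumptions, not any real product's API.
from dataclasses import dataclass

BLOCKED_TERMS = {"gambling", "violence"}  # placeholder filter list, not exhaustive


@dataclass(frozen=True)
class ChildAccountSettings:
    # The strictest values are the defaults, so protection does not rely on opt-in.
    profile_visibility: str = "private"
    allow_messages_from_strangers: bool = False
    location_sharing: bool = False
    data_collected: tuple = ("username",)  # data minimization: collect only what is needed
    content_filter_enabled: bool = True


def is_allowed(text: str, settings: ChildAccountSettings) -> bool:
    """Very naive content filter: block text containing any flagged term."""
    if not settings.content_filter_enabled:
        return True
    lowered = text.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)


if __name__ == "__main__":
    settings = ChildAccountSettings()  # safe by default, no configuration required
    print(settings.profile_visibility)                       # -> private
    print(is_allowed("weekend football scores", settings))   # -> True
    print(is_allowed("online gambling tips", settings))      # -> False
```

The point of the sketch is that the safest configuration is the one a young user gets without doing anything, which is the essence of the framework the forum participants described.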

“I think over the past five years we’ve seen a great improvement in the levels of protections, safeguards that have been put in place,” said Iain Drennan, executive director of the WeProtect Alliance. However, Drennan notes a lack of consistent transparency across companies.

“There was a very good report that the OECD put out recently that shows that it’s only a minority of the major service providers that are being used by children that have a consistent level of that input,” he said.

AI has the potential to dramatically improve online safety, according to Drennan: “And I think with AI, you’ve got an ability to process huge volumes of information and flag up abnormal behaviors, you train it with the right data, you put main indicators, and that then can act as a support for human moderators.”
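As a hedged illustration of the idea Drennan describes, and not his or any platform’s actual system, the toy sketch below flags accounts whose message volume deviates sharply from the group average and escalates them to a human moderator; the data, threshold, and field names are invented.

```python
# Toy illustration: flag accounts whose hourly message rate is far above the
# group average, then hand the flagged cases to human moderators.
from statistics import mean, pstdev

hourly_message_counts = {
    "user_a": 12,
    "user_b": 15,
    "user_c": 14,
    "user_d": 220,  # abnormal volume
}


def flag_abnormal(counts: dict[str, int], z_threshold: float = 3.0) -> list[str]:
    """Return users whose count is more than z_threshold standard deviations above the mean."""
    values = list(counts.values())
    avg, spread = mean(values), pstdev(values)
    if spread == 0:
        return []
    return [user for user, count in counts.items()
            if (count - avg) / spread > z_threshold]


for user in flag_abnormal(hourly_message_counts, z_threshold=1.5):
    print(f"Escalating {user} to a human moderator for review")
```

Real systems are trained on far richer behavioral signals, but the division of labor is the same as in the quote: automation surfaces the anomalies, and human moderators make the judgment call.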

He advocates for multinational cooperation and knowledge sharing to combat this global crime that ignores borders.

The urgency of this issue is highlighted by the DQ Institute’s 2023 Child Online Safety Index, which reveals that nearly 70 percent of children and adolescents aged 8-18 worldwide encountered at least one cyber risk in the past year, a figure that has remained stubbornly consistent since the index’s inception in 2018 and that the institute terms a “persistent cyber pandemic.”

Digital skills are essential, yet as Yuhyun Park, founder of the DQ Institute, explained to Arab News: “When people think about skills, digital skills, a lot of people think of it as a coding skill, which is important in computational thinking. But digital skills start with digital citizenship. So, before you know how to use and create technology, you need to be a good digital citizen who can use technology in a safe, responsible, and ethical way.”

Park likens digital citizenship to a passport necessary for ethical participation in the digital world.

The DQ Institute’s study, gathering data from over 350,000 children globally, aims to inform policymakers with a precise overview of child online safety measures worldwide. The UK, Germany, and China emerged as top performers in the study’s assessment.

Park noted that while cyberattacks have long been rising, it was not until the COVID-19 pandemic that the scale of immersion in digital spaces became fully apparent.

“So, what is actually quite important for children to understand is that they need to protect themselves at a very early age,” she said, highlighting the changed landscape that children now navigate compared to previous generations.

A comprehensive approach that includes regulatory frameworks, digital literacy education, technology design, international collaboration, and open dialogue is vital for creating a safe online space. Insights from the DQ Institute and similar organizations can significantly bolster global efforts to shield young netizens.

Saudi Arabia has made notable strides, as highlighted at the Global Cybersecurity Forum, where the 2023 Child Online Safety Index was launched. The Kingdom scored highly in the areas of children’s safe use of technology and ICT companies’ responsibility.

The darker corners of online platforms such as gaming, social media, and chat rooms serve as avenues for cyberbullying and grooming, while phishing and scams can deceive children into disclosing sensitive information or unwittingly draw them into criminal activity.

Even toys and devices under the IoT umbrella are becoming conduits for risk.

“When we talk about cyber criminals and maybe specifically hackers, what we see is degrees of impulsive and compulsive behaviors,” Dr. Mary Aiken, a cyberpsychologist at Capitol Technology University, told Arab News.

“So, for the first time, we’re looking at a cybercrime scene and looking at the exploit that was used as a weapon of choice, but we’re now observing a new breed of criminals who are targeting children for commercial interests,” Aiken added.

She acknowledges the importance of regional best practices and the need for protective legislation.

“It’s only a matter of time before entities that profit from online platforms and services recognize a legal duty of care to protect children,” Aiken explained. With the EU Digital Services Act, she sees an opportunity for the development of similar culturally sensitive legislation in other regions.

The battle against cyberthreats to children demands a concerted effort from all sectors — technology firms, educators, governments, and international organizations. It is imperative to forge a united and active defense strategy to ensure the digital well-being of our youth in an increasingly connected world.

Source: Arab News
