The UK's online safety law is so vague that service providers are choosing to block British users

Britain's new Online Safety Act, with its vague definitions and lack of clear thresholds, has placed enormous compliance pressure on online service providers worldwide. The Act requires services with "links" to the UK to implement safety measures, but never defines what counts as a "significant number" of UK users. This has forced non-profits such as Libera.Chat to spend heavily on legal advice without obtaining a definitive answer. Faced with high compliance costs and legal risk, many providers have chosen to block UK users outright rather than invest in complex compliance work. As a result, legitimate users lose access, community forums shut down, and open source project coordination is disrupted. These outcomes run directly counter to the Act's stated goal of improving online safety, and they are turning Britain into a market that some online services simply choose to avoid.

⚖️ **Legal vagueness creates a compliance dilemma**: The Online Safety Act never defines what constitutes a "significant number" of UK users, forcing service providers worldwide, even non-profits headquartered outside Britain, to spend heavily on legal opinions while still facing uncertainty. The vagueness itself becomes the mechanism of regulation: providers must judge for themselves whether they fall within its scope.

🚫 **Blocking users becomes the path of least resistance**: Because compliance costs are high and the risks hard to assess, many providers simply geo-block UK users rather than invest in elaborate safety measures. By accepting blocking as a resolution to safety concerns, the regulator effectively rewards this "do nothing" form of compliance.

💔 **Small communities and services face an existential threat**: The Act's costs are ruinous for small communities and non-profits, leading to the closure of services such as a cycling enthusiasts' forum and open source coordination platforms, with tens of thousands of users losing vital social and support networks. Large platforms can absorb compliance costs; small communities cannot.

🌍 **Extraterritorial reach sparks international controversy**: Britain seeks to apply its law to services worldwide, even those with only the most tenuous links to the UK. This risks fragmenting the internet and distorting service design, and it has prompted other jurisdictions (such as Jersey) to question the law's workability.

⚠️ **"Safety" measures may create new risks**: To get around the Act, users may turn to tools such as VPNs, which can carry data privacy risks or offer a false sense of security. Meanwhile, legitimate users are shut out, and people who genuinely need help (such as those seeking suicide prevention resources) can no longer reach it. These are unintended negative consequences of the Act in practice.

Britain's vague internet law pushes services to block UK users rather than comply

Regulatory uncertainty forces legitimate platforms to choose between expensive lawyers, preemptive blocking, or closure

A Swedish non-profit that helps open source developers coordinate software projects recently hired lawyers to answer a peculiar question: does British law apply to them? Libera.Chat operates from Sweden, banks in Sweden, and serves a global community of programmers who use Internet Relay Chat to build free software. Yet Britain's Online Safety Act forced them to seek legal advice about whether they're subject to UK regulation.

The answer, after consultation: probably not, but nobody can say for certain.

This uncertainty isn't a bug in the legislation. It's how the law works. The Act requires services with "links to the UK" to implement safety measures, but never defines what constitutes a "significant number" of UK users—the main test for whether those links exist. Ofcom, the regulator, says platforms should simply "explain their judgment" about whether they're in scope. For organisations trying to follow the law, this amounts to: figure it out yourself, hire lawyers if you can afford them, and hope Ofcom agrees with your reasoning.

Libera.Chat chose to share its legal advice publicly, reasoning that other online communities face identical uncertainty. That transparency reveals something striking: when even a Sweden-based non-profit serving software developers needs lawyers to determine if UK law governs them, regulatory vagueness has become the regulation itself.

A test with no threshold

The Online Safety Act sweeps astonishingly wide. Any service letting users share content or interact falls within scope—social media, messaging apps, forums, gaming servers, IRC networks. Ofcom estimates over 100,000 services worldwide must comply.

Yet the law provides no numerical threshold for "significant" UK users. Is it 1,000 users? 10,000? One per cent of your user base? Ten per cent of Britain's population? The legislation doesn't say. Ofcom won't clarify.

Libera.Chat's lawyers suggested that "significant" is measured against Britain's total population, not against a service's own user base. Under this reading, a platform with 10,000 users worldwide—3,000 of them British—wouldn't have "significant" UK users relative to 68 million Britons, despite British users comprising 30 per cent of its entire audience. This interpretation would limit the Act to only the largest platforms.

Yet Ofcom estimates 100,000 services are in scope. Either the population-relative interpretation is wrong, or services must assume they're subject to UK law unless they can prove otherwise—a reversal of normal regulatory principles. Instead of regulators demonstrating jurisdiction, platforms must demonstrate its absence.
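
To see how sharply the two readings diverge, here is a minimal sketch using the illustrative platform above; the one-per-cent thresholds are pure assumptions, since neither the Act nor Ofcom supplies any.

```python
# Two competing readings of a "significant number" of UK users.
# The 10,000-user platform with 3,000 UK users comes from the example
# above; both thresholds are invented for illustration, because the
# Act and Ofcom provide none.

UK_POPULATION = 68_000_000

def significant_vs_population(uk_users, threshold=0.01):
    """Population-relative reading: UK users as a share of all Britons."""
    return uk_users / UK_POPULATION >= threshold

def significant_vs_user_base(uk_users, total_users, threshold=0.01):
    """User-base-relative reading: UK users as a share of the service."""
    return uk_users / total_users >= threshold

uk, total = 3_000, 10_000
print(significant_vs_population(uk))        # False: ~0.004% of Britons
print(significant_vs_user_base(uk, total))  # True: 30% of the user base
```

The same service is trivially out of scope under one reading and plainly in scope under the other, and deciding between them is exactly the judgment the Act leaves to each platform.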

For organisations without legal departments, this creates impossible choices. Pay lawyers for advice that might not provide certainty. Implement compliance measures that might be unnecessary and certainly are expensive. Or eliminate the question entirely by blocking British users.

When exclusion becomes compliance

Four file-sharing services faced Ofcom investigations over child safety concerns. Their response: geo-block UK internet addresses. Ofcom closed the investigations, satisfied that blocking had "significantly reduced the likelihood that people in the UK will be exposed to any illegal or harmful content."

Consider what this means. The regulator treats denying access to everyone as equivalent to protecting users through safety measures. One approach blocks legitimate users alongside harmful content. The other implements targeted protections whilst maintaining access. By accepting geo-blocking as satisfactory, Ofcom signals that territorial exclusion achieves the law's goals as effectively as actual safety improvements.
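
For a sense of how little effort this form of "compliance" demands, here is a hypothetical sketch of geo-blocking as a WSGI middleware. It assumes the geoip2 package and a local copy of MaxMind's GeoLite2 country database; no service named in this article has published its actual implementation.

```python
# Hypothetical sketch of IP-based geo-blocking as a WSGI middleware.
# Assumes the `geoip2` package and MaxMind's GeoLite2-Country database;
# the class name and database path are illustrative.
import geoip2.database
import geoip2.errors

class BlockUK:
    def __init__(self, app, mmdb_path="GeoLite2-Country.mmdb"):
        self.app = app
        self.reader = geoip2.database.Reader(mmdb_path)

    def __call__(self, environ, start_response):
        ip = environ.get("REMOTE_ADDR", "")
        try:
            country = self.reader.country(ip).country.iso_code
        except (ValueError, geoip2.errors.AddressNotFoundError):
            country = None  # unknown or unroutable addresses pass through

        if country == "GB":
            # Refuse service entirely rather than implement the Act's
            # safety duties; Ofcom has accepted this as a resolution.
            start_response("451 Unavailable For Legal Reasons",
                           [("Content-Type", "text/plain")])
            return [b"This service is not available in the United Kingdom.\n"]
        return self.app(environ, start_response)
```

A handful of lines along these lines has been enough to close an Ofcom investigation; none of them make any user safer.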

The pattern repeats. Gab blocked UK users. Civit.ai blocked UK users. A suicide prevention forum began geo-blocking when enforcement proceedings started. Ofcom expressed satisfaction that the forum is no longer accessible from Britain—apparently unconcerned that people seeking suicide prevention resources now cannot reach it.

This reveals the law's practical operation. Platforms escape British regulation not by making their services safer, but by making themselves unavailable to British people. The legislation creates a perverse incentive: blocking UK users is easier and cheaper than compliance, particularly when the regulator explicitly accepts blocking as a solution.

British users increasingly face barriers to legitimate online communities, especially smaller platforms and specialist services. When Ofcom's own enforcement actions push services toward geo-blocking, the law achieves something close to the opposite of making Britain "the safest place to be online." It makes Britain a place many online services simply refuse to serve.

The death of small communities

London Fixed Gear and Single Speed wasn't a business. It was a community hub where bicycle enthusiasts met partners, supported each other through illness, reunited owners with stolen bikes, and provided mental health support. Members credited it with preventing suicide. Yet the forum shut down rather than face the Online Safety Act's requirements.

The arithmetic was brutal. Legal compliance would cost tens of thousands of pounds. The forum received a few hundred pounds monthly in donations. Age verification alone would cost £2,400 annually—just for the external service, before legal advice, before staff time, before liability insurance. The same regulatory requirements that Meta absorbs without noticing destroyed a volunteer-run community space.
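
The gap is easy to make explicit. A back-of-the-envelope sketch, taking £300 a month as a stand-in for "a few hundred pounds" in donations and the low end of "tens of thousands" for legal costs:

```python
# Back-of-the-envelope budget for the forum described above.
# £300/month stands in for "a few hundred pounds" in donations; the
# £2,400 age-verification figure is from the article, and the legal
# figure is the low end of "tens of thousands of pounds".
monthly_donations = 300
annual_income = monthly_donations * 12               # £3,600

age_verification = 2_400                             # external service only
legal_compliance = 20_000                            # before staff time, insurance
annual_costs = age_verification + legal_compliance

print(f"income  £{annual_income:,}")                 # income  £3,600
print(f"costs   £{annual_costs:,}")                  # costs   £22,400
print(f"deficit £{annual_costs - annual_income:,}")  # deficit £18,800
```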

Microcosm, hosting nearly 300 community forums for 275,000 users, announced similar closures. The operator couldn't accept personal criminal liability for failing to comply with Ofcom enforcement notices—a risk the Act imposes on senior managers. Thousands of people lost community spaces serving as support networks, hobby groups, and mutual aid organisations.

The Act includes proportionality provisions. Smaller services supposedly face lighter obligations than large platforms. But "lighter" doesn't mean "affordable." Even reduced compliance requirements impose costs beyond volunteer communities' means. A forum getting hundreds in donations monthly cannot afford thousands in annual compliance costs, no matter how proportional those costs seem relative to Meta's burden.

The consolidation seems almost designed. Problems motivating the Online Safety Act—harmful content, inadequate moderation, user exploitation—stem primarily from large commercial platforms, not bicycle forums or open source IRC networks. Yet uniform regulatory requirements hit small communities hardest whilst large platforms continue operating with compliance costs that barely register against revenues of billions.

Claiming the world

Britain's approach asserts authority over any service accessible to British users. This creates friction. Libera.Chat—Swedish organisation, Swedish legal status, Swedish banking—must consider whether British regulations apply because British developers use IRC to coordinate software projects.

The extra-territorial reach affects service design. Platforms might avoid features interpreted as targeting UK users. They might implement systems to track and limit British audiences. Some conclude any UK users create unacceptable regulatory risk. A British law discouraging services from serving British people.

Jersey, a Crown dependency, declined to enforce the Act in its territory, citing "inadequacies" in the legislation. Even local authorities within British jurisdiction question the law's workability. Wikipedia challenged the Act in court, arguing compliance would compromise its open editing model and invite state censorship. The foundation lost, but the challenge itself signals how far the Act's reach extends.

Individual countries can regulate global platforms through extra-territorial legislation, but early evidence suggests this approach creates fragmentation rather than safety. Services respond by geo-blocking entire countries. The internet balkanises, access determined by physical location rather than technical capability.

Unintended casualties

Open source software development relies on IRC and similar platforms for project coordination. When these services face British regulatory uncertainty or choose to block UK users, British developers need workarounds. Libera.Chat noted this explicitly: UK individuals shouldn't require censorship-defeating proxies just to engage with free software communities.

VPN downloads surged 1,800 per cent when age verification requirements took effect. Users, including minors the law aims to protect, learned to circumvent geo-restrictions. Many free VPN services harvest user data or provide false anonymity. By pushing people toward these tools, the legislation exposes users to different harms.

The pattern shows regulatory vagueness creating worse outcomes than targeted regulation. Services face impossible choices. Legitimate platforms block UK users. Community spaces close. Developers need workarounds for project infrastructure. None of these outcomes advances the goal of protecting users from harm.

Libera.Chat's transparency—publishing legal advice about regulatory uncertainty—provides a model. But it also highlights the problem: even professional legal counsel cannot provide definitive answers about whether British law applies. This uncertainty itself functions as regulation, pushing services toward the safest legal option: unavailability to British people.

The law's architects presumably didn't intend to incentivise geo-blocking or force forum closures. Yet these consequences flow directly from the legislation's structure. Vague definitions, undefined thresholds, and regulatory acceptance of territorial exclusion combine to make rational business decisions produce perverse outcomes. The safest legal path increasingly means denying access to British users rather than attempting compliance with rules nobody can definitively explain.

The Online Safety Act aimed to make Britain the safest place online. Instead, it's making Britain the place online services increasingly choose to avoid.

#politics #technology
