Study: Kids with smartphones are less depressed, anxious, bullied than peers without them
https://alecmuffett.com/article/113154
#OnlineSafety #ScreenTime

There's still time to put pressure on the UK government.
The UK Secretary of State has the power to exempt small, safely moderated websites from the Online Safety duties.
We need urgent change to protect net plurality, rather than further consolidating power in monopoly platforms. We need competition for a safer Internet.
Write to your MP (UK) #SaveOurSites
https://action.openrightsgroup.org/save-our-sites-write-your-mp
The UK Online Safety Act comes into effect today.
Its onerous duties may cause many small sites, blogs and fedi instances to shut down or geoblock UK users when faced with potential fines and penalties.
This won't keep children safe. It'll benefit large platforms like Facebook and X that are laying waste to content moderation.
Moderation tools I wish were available at the user level:
1) The ability to reject and remove unwelcome comments on my thread.
2) The ability to send a toot with comments closed entirely; not everything has to be a discussion/argument.
When it comes to blogs, Ofcom says one thing, the UK Online Safety Act says another.
This lack of clarity over whether blogs with comments are exempt will push small sites to shut down completely.
We need the UK government to tighten up the definitions and exemptions in the Act.
Read our explainer for more detail https://www.openrightsgroup.org/blog/save-our-sites-deadline-17-march/
Under the UK Online Safety Act, small blogs, forums and fedi instances face disproportionate requirements to:
1) Check if they have UK users
2) Do a risk assessment on whether kids might access the content, or whether CSAM or terrorist material might be posted in the comments
3) Put themselves at risk of fines, and even prison sentences, if they fail to comply with Ofcom’s future directives
The UK Online Safety Act burdens small sites with duties and penalties that they can't shoulder. They'll shut down instead, stripping us of net plurality.
There’s a simple solution:
Exempt small, safely run blogs, forums and fedi instances
The government can do this now
The duties start TOMORROW – Write to your MP
https://action.openrightsgroup.org/save-our-sites-write-your-mp
Saddling small sites with the same duties as huge platforms means many will shut down in a hammer blow to net plurality.
We'll be left with the Sophie’s choice of monopoly services; the incubators of online harms.
URGENT: The UK government must change the Online Safety Act to protect safe, non-commercial blogs, forums and fediverse.
Write to your MP to #SaveOurSites
https://action.openrightsgroup.org/save-our-sites-write-your-mp
Will the last small site turn off the lights?
The UK Online Safety Act imposes the same duties and penalties on blogs, forums and fedi instances as huge platforms.
Many small, safely moderated sites will shut down or block UK users.
Crushing competition is the last thing we need for a safer Internet!
Act now before 17 March
https://action.openrightsgroup.org/save-our-sites-write-your-mp
Are blogs exempt?
While Ofcom suggests they are, the UK Online Safety Act itself is far from clear.
The result is that small-scale, non-commercial blogs will simply shut up shop.
This will push users from safely moderated sites onto services like Facebook and X – the fountainheads of online harms.
Protect Net Plurality!
Broad brush duties under the UK Online Safety Act threaten any website with possible penalties.
Small, safe sites can't shoulder this regime. We'll see the lights going out on blogs and forums from 17 March with a devastating impact for online communities.
We must #SaveOurSites
https://www.openrightsgroup.org/blog/save-our-sites-deadline-17-march/
Carpe DM ?
End-to-end encryption = online safety. It keeps what we send on messaging apps secure from hackers and predators.
Tell Ofcom NOT to implement message scanning powers in their consultation.
You have until 5pm TODAY!
https://action.openrightsgroup.org/48-hours-tell-ofcom-practice-safe-text
Save Encryption. Save the World
Only by blocking message scanning technology on messaging apps can we ensure online safety!
End-to-end encryption prevents predators and hackers from worming their way into our private lives.
We must #PracticeSafeText
https://www.openrightsgroup.org/blog/the-case-for-encryption/
Message scanning tech on everyone's phone would expose us to dangerous new threats.
How does that achieve online safety?
Tell Ofcom NOT to break end-to-end encryption in their consultation, ending Monday 10 March.
Use our tool to say #PracticeSafeText
https://action.openrightsgroup.org/48-hours-tell-ofcom-practice-safe-text
uBlockOrigin & uBlacklist Huge AI Blocklist
A huge blocklist of manually curated sites (1000+) that contain AI generated content, for the purposes of cleaning image search engines (Google Search, DuckDuckGo, and Bing) with @ublockorigin or uBlacklist. Also works on mobile (iOS, iPadOS, Android) via uBlacklist, as well as pihole/adguard.
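For context, a uBlacklist subscription is just a plain-text list of rules, typically browser-style match patterns, one per line. A minimal hand-written example with hypothetical domains (the real curated list linked above is far larger):

```text
*://*.ai-image-spam.example/*
*://generated-art.example/*
```

Subscribing to a hosted list of such patterns in uBlacklist keeps the block rules updated without maintaining them by hand.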
The main Fediverse platforms offer two basic protection tools: Mute and Block. On Mastodon, for example, a mute can be set for a specific period of time. Muting someone means you won't see their posts, but you'll still be notified if they mention or tag you. A block, on the other hand, means you don't want to see anything related to that person any more. It's a much more radical action, useful if that person is disrupting your experience, offending you, or displaying other seriously problematic behavior. In such cases, it’s also important to file a report so administrators can take action and learn more about the user.
Sometimes I see advice to block immediately anyone posting content you don't want to see. In my opinion, if the content is simply not of interest to you, a mute is enough, while a block is more appropriate for people displaying clearly negative behavior.
Many platforms also offer filters: you can choose which words or hashtags you don't want to see and, accordingly, hide them. Filters can help personalize and improve your overall social experience.
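On Mastodon specifically, these same actions are also exposed through the public REST API, so a timed mute can be scripted. A minimal sketch in Python, assuming a hypothetical instance name, access token and account id:

```python
# Hedged sketch: a timed mute via the Mastodon REST API
# (POST /api/v1/accounts/:id/mute). The instance, token and account id
# used below are placeholders, not real values.
import urllib.parse
import urllib.request

def build_mute_request(instance: str, account_id: str, duration_seconds: int = 0):
    """Return (url, form_body) for a mute request.

    duration_seconds=0 mutes indefinitely; any other value lifts the
    mute automatically after that many seconds.
    """
    url = f"https://{instance}/api/v1/accounts/{account_id}/mute"
    body = urllib.parse.urlencode({
        "notifications": "true",           # also hide notifications from them
        "duration": str(duration_seconds),
    }).encode()
    return url, body

def mute(instance: str, token: str, account_id: str, duration_seconds: int = 0) -> bytes:
    """Perform the mute; returns the raw JSON relationship object."""
    url, body = build_mute_request(instance, account_id, duration_seconds)
    req = urllib.request.Request(
        url, data=body, method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example (not executed here): mute account 12345 for one week.
# mute("mastodon.example", "YOUR_ACCESS_TOKEN", "12345", 7 * 24 * 3600)
```

The `duration` parameter is what backs the "mute for a specific period of time" option in the web interface; passing 0 mutes indefinitely.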
Educators, if you are teaching about Digital Literacy or Online Safety, @cyberlyra has put together amazing free resources about how to reclaim your data.
https://www.optoutproject.net/the-cyber-cleanse-take-back-your-digital-footprint/
#DigitalLiteracy #OnlineSafety #EdTech #EduTech #Educators #Edutooters #Homeschooling @education @edutooters
Reminder for all people who care about a safe social media environment.
Since Mastodon version 4.3.0 there are new notification defaults that we are not able to configure server-wide. 'Unsolicited private mentions' and 'Moderated accounts' are both set to 'Filter'. We advise you to set both of these to 'Ignore'.
The reason is that 'Unsolicited private mentions' (aka 'Unsolicited direct messages') are highly susceptible to spam and harassment. We believe users should have to explicitly allow them, so we consider 'Filter' a bad default: you are still forced to review the notification, and the spammer or harasser still gets their way.
'Moderated accounts' are accounts that have been limited for a good reason (probably someone reported them), or accounts belonging to a Mastodon/fediverse server that has been limited/silenced (also for a good reason). These accounts and servers were not problematic enough to suspend, but problematic enough to hide, or to warn you about when you visit their profile. You can still follow them if you like fierce discussions, but for most of us there is no reason to see those accounts on our timelines, and no reason to be notified about their existence via filtered notifications either.
I just realized that some Mastodon servers are based in the United States (see below)
Mastodon.social (316k MAU)
Mas.to (13k MAU)
Mastodon.online (12k MAU)
Techhub.social (5.6k MAU)
Mastodonapp.uk (3.9k MAU)
Universeodon.com (3.8k MAU)
Mastodon.cloud (2.5k MAU)
With things happening on the data privacy front in the United States, would you advise people to transfer to a non-US server?
The Pornhub Bans Are Not About Keeping Kids Safe | …frankly this goes for most online safety regulation…
https://alecmuffett.com/article/110823
#AgeVerification #OnlineSafety #censorship #pornhub