
All things : New iPad and iPad Pro released


BenG


Popster to Popster 4,583
On 8/7/2021 at 2:00 PM, BenG said:

@Sir.Sim I don't know why you're yaas-reacting to a post that was deliberately meant to hurt me and I've made my feelings very clear about this post. You have reacted to the posts on the argument, which means you knew what went on and how I felt. As a mod, you should have also seen what this post was flagged under.

I know you're close to Donnie but that is very un-mod like of you. I was going to ask you to hide the post because you were online but this is what I see. Very, very disappointed and hurt, ngl.

@corvus albus can you please hide the post above? Thank you.

 

Fight, fight, fight, fight, fight, fight, fight!


  • BenG changed the title to All things : Magic Keyboard with Touch ID
Pop-a-911-ster 38,687

Apple is now selling the Magic Keyboard with Touch ID as a standalone accessory. Silver is the only colour available for standalone purchases, however, unlike the colourful versions that ship with the M1 iMac. Touch ID only works with Macs running Apple silicon.


  • Magic Keyboard with Touch ID and Numeric Keypad - $179
  • Magic Keyboard with Touch ID - $149
  • Magic Keyboard - $99
  • *Magic Keyboard with Numeric Keypad - $129

*Not updated with the latest design, so the corner buttons are not rounded.

The Magic Trackpad and Magic Mouse have also been updated with the latest design, and are selling at the usual price of $129 and $79, respectively.

All Magic accessories now ship with the woven USB-C to Lightning Cable.



Pop-a-911-ster 38,687
On 8/8/2021 at 7:27 AM, Nightwing said:

What are your thoughts on Apple wanting to roll out the surveillance of CSAM on people's iClouds? @BenG

https://www.apple.com/child-safety/

Slippery slope, but I'm not sure why Apple alone is getting so much flak. Other companies like Google and Adobe have had similar scanning in place for some time now. Maybe because they do it secretly?

I can see where they're coming from, though. Companies are protecting themselves because they don't want illegal images on their servers. Also, users are not forced to use cloud storage, and if they do, this is just part of the T&C. You are not affected if you do not use iCloud Photos.


A Popster Is Born 17,072
On 8/8/2021 at 4:10 PM, Nightwing said:

I think it’s also the fact that Apple prides itself on being “privacy protectors” so this is kind of odd for them to be doing even if this is already standard stuff.

Sure, but I do believe they were kinda forced to do it. 


A Popster Is Born 17,072
On 8/8/2021 at 7:49 PM, Nightwing said:

How so? I mean I guess in the way that other major tech companies already do this for their cloud services as well 

Yes, I think so. I know they're basically one of, if not the, biggest player on the market, but I watched a vlog from a super popular YouTuber in my country who went on and on about how Apple maybe did not want to do it but was pushed into it somehow.

  • paws up 2

Pop-a-911-ster 38,687

The implementation is particularly interesting. The analysis is done on-device instead of in the cloud, which would have been much simpler given that Apple holds the decryption keys to iCloud data. 


Pop-a-911-ster 38,687
On 8/8/2021 at 8:46 PM, Nightwing said:

What do you think is the reason for that? Because that does seem to be a weird approach. 

Laying the groundwork for end-to-end encrypted iCloud data :darkmode:

We can only wish :mess:

  • lol 1

On 8/8/2021 at 1:27 AM, Nightwing said:

What are your thoughts on Apple wanting to roll out the surveillance of CSAM on people's iClouds? @BenG

https://www.apple.com/child-safety/

Some of the language in the announcement seems a bit weird. They mention a 1 in a trillion chance of their threshold messing up, but one in a trillion what? Photos? Maybe I'm misreading but it's not very clear. 

 

Also not sure how I feel about them warning parents about explicit material. Should teenagers be sending explicit photos? No, but it does happen, and I think alerting parents about it is kind of meh. It's gonna create more surveillance for overbearing parents, and it's not like the kids will stop sending the photos; they'll just go elsewhere (possibly somewhere unsafe) to do it. 

 

What is considered a "known" CSAM image? Does some legal authority feed Apple AI child porn images in order to train it, creating a big database of child porn? And depending on how they source it, is it possible somebody has a picture of themselves from years ago in their iCloud that will eventually match and flag them? 

 

I'm not the most literate when it comes to computer science so I don't fully understand it, but I think they should have written things a bit more simply and concisely. And I'm somewhat wary of some of the implications, but I also don't think Apple would risk its #1 marketing point in order to do this. 


Pop-a-911-ster 38,687
On 8/9/2021 at 2:23 PM, justhislife said:

Some of the language in the announcement seems a bit weird. They mention a 1 in a trillion chance of their threshold messing up, but one in a trillion what? Photos? Maybe I'm misreading but it's not very clear. 

Chance of an account being wrongly flagged, i.e. reaching the threshold incorrectly.
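
To put that threshold in perspective, here's a rough back-of-the-envelope sketch. The numbers below (per-photo false-match rate, library size, match threshold) are made up for illustration rather than Apple's published parameters; the point is just that requiring many independent matches pushes the account-level error rate down extremely fast.

```python
from math import comb, log10

# Illustrative numbers only -- NOT Apple's published parameters.
per_image_fp = 1e-6      # assumed false-match probability for a single photo
library_size = 10_000    # assumed number of photos in an iCloud library
threshold = 30           # assumed number of matches needed before an account is flagged

# With library_size * per_image_fp << 1, the chance of an innocent account
# reaching the threshold is dominated by the "exactly threshold matches"
# term of the binomial distribution.
p_flagged = (
    comb(library_size, threshold)
    * per_image_fp ** threshold
    * (1 - per_image_fp) ** (library_size - threshold)
)
print(f"approx. account-level false-flag probability: {p_flagged:.1e}")
print(f"roughly 1 in 10^{-log10(p_flagged):.0f}")
```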

On 8/9/2021 at 2:23 PM, justhislife said:

What is considered a "known" CSAM image? Does some legal authority feed Apple AI child porn images in order to train it, creating a big database of child porn? And depending on how they source it, is it possible somebody has a picture of themselves from years ago in their iCloud that will eventually match and flag them? 

No machine learning is involved, i.e. no algorithm is trained using a database of child porn to look out for what would be classified as child porn. 

The database is provided by other organisations, and Apple does not even look at the raw image. They just do hash-matching. A hash is like a unique “fingerprint” derived from the image.

So, no. Your private photos will not be flagged because they wouldn’t exist in the child porn database in the first place to match. 
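
If it helps, here's a toy sketch of the hash-matching concept. It uses an ordinary SHA-256 file hash and a made-up set of "known" hashes; Apple's actual system uses a perceptual hash (NeuralHash) computed on-device with additional cryptographic blinding on top, so treat this purely as an illustration of a hash-set lookup, not their implementation.

```python
import hashlib
from pathlib import Path

# Toy stand-in for the database of known-CSAM hashes supplied by child-safety
# organisations. The digest below is an arbitrary placeholder; in the real
# system these are perceptual hashes, not plain file hashes.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Hash the raw file bytes -- the image content itself is never inspected."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_database(path: Path) -> bool:
    """True only if the photo's hash appears in the known-hash set."""
    return file_hash(path) in KNOWN_HASHES

# A personal photo produces a hash that simply isn't in the set, so it can
# never match -- which is the point about private photos above.
# print(matches_known_database(Path("IMG_0001.jpg")))
```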

  • paws up 1

Pop-a-911-ster 38,687
On 8/9/2021 at 3:17 PM, Nightwing said:

It only applies to the Messages app and I don’t think the majority of people are sending nudes through Messages. It does seem to be Apple overstepping its boundaries a bit though.

It’s an opt-in feature though


Pop-a-911-ster 38,687
On 8/9/2021 at 4:20 PM, Nightwing said:

Do kids that young even send nudes? Also I assume most don’t even have a phone and if they do, sure those parental controls should be in place but overall I think it’s an invasion of privacy

I think there’s a decent chunk of the market. The Apple Watch has also been able to be set up in a “child mode” since last year.

I’m assuming the new feature is to protect children from predators that are sending explicit images. 

  • paws up 1

On 8/9/2021 at 8:56 AM, BenG said:

Chance of an account being wrongly flagged, i.e. reaching the threshold incorrectly.

No machine learning is involved, i.e. no algorithm is trained using a database of child porn to look out for what would be classified as child porn. 

The database is provided by other organisations and Apple does not even look at the raw image. They just do hash-matching. A hash is like a “key” attached to an image.

So, no. Your private photos will not be flagged because they wouldn’t exist in the child porn database in the first place to match. 

My question, though, is where the other organizations source their images. If your image unknowingly makes its way to a predator and they feed that hash to Apple, then what? What stops that from happening? 


Pop-a-911-ster 38,687
On 8/10/2021 at 8:16 AM, justhislife said:

My question is though where do other organizations source their images? If your image makes its way to a predator unknowingly and they feed that hash to Apple, then what? What stops that from happening? 

"the system performs on-device matching using a database of known CSAM image hashes provided by [National Center for Missing and Exploited Children] and other child safety organizations"

NCMEC's database consists of known illegal images and videos.


i think the whole scanning ur photos thing is an invasion of privacy but i think it's worth it in a sense, if it helps reduce CP being spread across the internet. of course it can and most likely will be used to hack and search people's images...


  • 2 weeks later...
Pop-a-911-ster 38,687
On 8/20/2021 at 4:57 PM, Delusional said:

Just collected my iMac that I bought for just above £50 :icant:

it’s a 2011 model but it’s in amazing condition, so happy whew. Came with authentic Apple keyboard and mouse too :giveup:

I’ll probably keep it for a bit and then maybe upgrade to something new in 1-2 years. 

Upgrade the HDD to an SSD and bump the RAM to 16GB if it isn't already :crowned:

  • paws up 1

Pop-a-911-ster 38,687
On 8/20/2021 at 5:04 PM, Delusional said:

Nnnnn looking at the Apple website both look so difficult to do - is it really worth it?

you'll have to pull the display out :icant:


  • BenG changed the title to All things : New iPad and iPad Pro released
