Another Response to Pen Test Partners - Part 1

As anyone who follows the project on Twitter or has read my previous rebuttal knows, Pen Test Partners has been fairly active in researching various IoD devices on their own, as can be read here, here, here, and here.

I've previously posted my rebuttal to their very technically interesting, but highly juvenile, reverse engineering and analysis of the Siime Eye Vibrator. Since then they've done other research that, in my opinion, suffered from a number of errors, inaccuracies, and gaps in understanding that I wish to address. Some are technical; others draw on insights from my year or so of dealing directly with vendors and the industry.

Interestingly enough, even though I may have issues with their work, they are proving that great minds think alike (no ego there). By pursuing the exact same line of research as I am, at the same time, without either of us knowing, they show that this tech needs the work. It's almost becoming comical how our efforts overlap, and it shows the tech is at a point where security and privacy are concerns genuinely in need of research.

This response and the others that follow should not be considered anything personal against Pen Test Partners, and there are no hard feelings. Any issues are simply professional disagreements. In fact, I chatted with Ken Munro at length over Skype and email and have been in regular communication with him to make sure we are on the same page. We both want to prevent future duplication of effort and ensure accuracy, and we agree there is certainly room for collaboration in the future. Hopefully we can set the record straight on the state of IoD tech so everyone is working with the same facts.

In the meantime, I have a few bones to pick with the past.


Regarding the July 24th, 2017 blog post entitled Adult IoT toys. Privacy invasion or worse?, there are some technical inaccuracies and false assertions to be dealt with.

Besides, these are very similar security flaws to those we find in other areas of IoT. In many cases, the devices are just a layer of latex over a PCB/battery/motor.

I'll start off by saying that Ken's post does a better job than previous ones at staying level-headed and avoiding juvenile humor. He also makes the point that the issues IoD faces are the same ones everyone else is dealing with in the larger IoT field, just in a different, more intimate package.

Mobile app permissions have often been excessive too; in one case we found a sex toy app had the permission to make calls. Why?

First troublesome assertion. While I've found lots of apps with unnecessary, questionable, and even scary permissions (WRITE_SECURE_SETTINGS was the scariest so far), a statement like this is problematic. Which permission was it? Was it ever actually called (used) within the app? Many call- and phone-related permissions exist to access other information about the phone, and some uses are entirely justified, such as automatically silencing incoming calls (who wants to be interrupted mid, ummm, session?).
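
To make that concrete, here's a minimal sketch (my own illustrative code, not taken from any vendor's app) of a legitimate use of the scary-sounding READ_PHONE_STATE permission: pausing the toy when a call comes in.

```java
// A minimal sketch (my own example, not from any vendor's app) of a legitimate
// reason for a "phone" permission: pause the toy when a call comes in.
// Requires READ_PHONE_STATE in the manifest, which looks scary in isolation.
import android.content.Context;
import android.telephony.PhoneStateListener;
import android.telephony.TelephonyManager;

public class CallPauser {
    public static void pauseOnIncomingCall(Context ctx, Runnable pauseToy) {
        TelephonyManager tm =
                (TelephonyManager) ctx.getSystemService(Context.TELEPHONY_SERVICE);
        tm.listen(new PhoneStateListener() {
            @Override
            public void onCallStateChanged(int state, String phoneNumber) {
                if (state == TelephonyManager.CALL_STATE_RINGING) {
                    pauseToy.run(); // nobody wants to be interrupted mid-session
                }
            }
        }, PhoneStateListener.LISTEN_CALL_STATE);
    }
}
```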

As well, some apps are several years old (a common issue). These apps target older versions of Android that lack the newer, more granular permission model, so they can appear to be asking for excessive permissions now when that was all that was available at the time. Other times, excessive permissions come down to lazy coding (just allow everything) or debugging settings that never got turned off. It does happen.

The problem with Ken's statement is that it asks "Why?" without providing an answer, or any evidence that the issue was reported and the vendor asked that question. It leaves the question hanging and leads the reader to wonder, opening the door to all sorts of conjecture and suspicion of malfeasance where there may be none. It's unhelpful to speculate until all the facts are in.

I would invite Ken to clarify which app and which permission he is referring to, and whether the permission is ever exercised in the app. I can provide independent analysis if needed. Sometimes all you need to do is dig a little more with another set of eyes.

One aspect of the security guidelines we're working on is a regular review of app permissions for any that are unneeded or excessive. Additionally, documenting and explaining why each permission is needed is essential to being transparent with users. Frankly, all apps should do this, and some already do.
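
As a sketch of what that review can look like in practice (using only the standard Android PackageManager API; the class name and log tag are my own):

```java
// A minimal sketch of the kind of permission review the guidelines call for:
// dump every permission an installed app requests so each one can be justified
// or cut. Uses only the standard PackageManager API.
import android.content.Context;
import android.content.pm.PackageInfo;
import android.content.pm.PackageManager;
import android.util.Log;

public class PermissionAudit {
    public static void logRequestedPermissions(Context ctx, String packageName)
            throws PackageManager.NameNotFoundException {
        PackageInfo info = ctx.getPackageManager()
                .getPackageInfo(packageName, PackageManager.GET_PERMISSIONS);
        if (info.requestedPermissions == null) return; // app requests nothing
        for (String perm : info.requestedPermissions) {
            // Every entry here should have a documented reason to exist.
            Log.i("PermissionAudit", perm);
        }
    }
}
```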

Gathering location data can be quite a sensitive matter in relation to adult toys. Does the vendor really need this? It’s reasonable to gather usage patterns for product improvement, but not if that can be attributed to an individual users location. We-Vibe made this mistake and others in their mobile app and paid a heavy price, yet the latest version of their We-Connect app (2.5.3) still collects approximate location.

This one pissed me off a lot. It's just poor research standards, or more accurately, a lack of curiosity and follow-through.

Every Android app that uses WiFi or Bluetooth and needs to scan for a device has to ask for the location permission. I saw this behavior myself when I started and was concerned, especially because of the implications for IoD device users. When I first noticed the request, there was no obvious reason or explanation. I chose to do my due diligence and spent a great deal of time trying to get an answer. I went as far as querying a contact I have at Android Security, and even they had to dig. I went straight to the top for an answer before I said anything incorrect.

There is a larger article coming with more detail, but in short: since Android 6.0, Bluetooth has required the location permission in order to scan the local area for nearby devices.

If you read the Android developer documentation on Bluetooth permissions and Bluetooth Low Energy, you would see that Bluetooth use by an app requires the ACCESS_COARSE_LOCATION permission at a minimum in order to function. Specifically, the startScan function, which searches nearby for the very device the app needs to connect to, will not work without it. That seems kind of necessary for the functioning of the app and device.
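
To make the platform requirement concrete, here is a minimal sketch of the scan path nearly every IoD app has to implement; the code is my own illustration, not decompiled from We-Connect or any other vendor app.

```java
// Minimal sketch of the BLE scan path nearly every IoD app implements.
// On Android 6.0+ the scan returns nothing (or throws) without a location
// permission, which is why these apps all request ACCESS_COARSE_LOCATION.
// (BLUETOOTH and BLUETOOTH_ADMIN are also assumed declared in the manifest.)
import android.Manifest;
import android.app.Activity;
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.le.ScanCallback;
import android.bluetooth.le.ScanResult;
import android.content.pm.PackageManager;
import android.util.Log;

public class ToyScanner {
    private final ScanCallback callback = new ScanCallback() {
        @Override
        public void onScanResult(int callbackType, ScanResult result) {
            // Here the app would match the toy's advertised name or service UUID.
            Log.d("ToyScanner", "Found device: " + result.getDevice().getAddress());
        }
    };

    public void scanForToy(Activity activity) {
        if (activity.checkSelfPermission(Manifest.permission.ACCESS_COARSE_LOCATION)
                != PackageManager.PERMISSION_GRANTED) {
            // The user sees a location prompt here, purely so the scan can work.
            activity.requestPermissions(
                    new String[]{Manifest.permission.ACCESS_COARSE_LOCATION}, 1);
            return;
        }
        BluetoothAdapter.getDefaultAdapter()
                .getBluetoothLeScanner()
                .startScan(callback); // yields no results without the permission
    }
}
```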

Now, I am not a coder, and I had no reason to have known this otherwise. It took me a couple of Google searches to find references confirming it was necessary for WiFi and Bluetooth connections. Finding out why, however, took some time. It has to do with WiFi and Bluetooth beacons being usable for location sensing indoors, where GPS is not available.

For whatever reason, Pen Test Partners did not spend even a little time with Google and the Android developer documentation to understand why We-Connect, and nearly every other IoD app, requests ACCESS_COARSE_LOCATION (needing FINE location would be another matter). As a result, the article spreads Fear, Uncertainty and Doubt about We-Vibe asking for this permission, when the real reasoning is much more mundane and dictated by the platform, not the vendor.

While there are some apps, not yet fully analysed, that do collect location data beyond what the Bluetooth connection requires, one should not condemn a vendor for doing exactly what the platform requires in order to use the functionality they need. If they use the permission beyond that, then there are legitimate questions to be asked.

That said, Pen Test Partners missed a chance to learn something very interesting in the process, and it's worth discussing.

For reference, version 2.5.3 of the We-Connect app (which they cite as suspicious for asking for location permissions) does query the user's location elsewhere in the code, outside of the Bluetooth connection. Specifically, it queries the user's location in the ClassActionService.java file of the Android app, then uses that location in queries it sends to vendor-operated web servers.

If you stop right there, you might think: "Aha! Gotcha! They were collecting users' locations in some malevolent scheme!"

However, if you dig just a little bit, you find the truth is far from that and, while interesting, pretty mundane.

The pages called in ClassActionService.java are 1 and 2. If the filename wasn't a dead giveaway, the user's location is used to determine which country the user is in as part of the class action lawsuit. This is the court-ordered user notification from the aforementioned lawsuit, which cost them dearly ($5 million CAD), informing users that they may be able to make a claim.
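
For the curious, the logic boils down to something like the following. To be clear, this is a hypothetical reconstruction to show the shape of it; the class, endpoint, and parameter names are illustrative, not the actual decompiled We-Connect source.

```java
// Hypothetical reconstruction of the ClassActionService logic; the names and
// the endpoint are illustrative, not the actual decompiled We-Connect code.
import android.location.Location;
import android.location.LocationManager;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ClassActionCheck {
    // Fetches a coarse location (the same ACCESS_COARSE_LOCATION discussed
    // above) and asks the vendor server whether the user is in a jurisdiction
    // covered by the court-ordered class-action notice.
    public static boolean shouldShowNotice(LocationManager lm) throws Exception {
        Location loc = lm.getLastKnownLocation(LocationManager.NETWORK_PROVIDER);
        if (loc == null) return false; // no fix, no notice
        URL url = new URL("https://example-vendor-server.invalid/classaction"
                + "?lat=" + loc.getLatitude() + "&lon=" + loc.getLongitude());
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (InputStream in = conn.getInputStream()) {
            return in.read() == '1'; // server says whether to show the notice
        } finally {
            conn.disconnect();
        }
    }
}
```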

While I have no proof they weren't doing anything nefarious with the location data once it reached the back end server, one can safely assume they wouldn't try anything sneaky with a function tied to a class action lawsuit they had just lost.

Again, raising the suspicion without any evidence of investigation is the opposite of helping.


By far the most common security issue we found with adult toys was to do with pairing.

There is a whole article planned on the issue of local, direct connect attacks but again, I shall summarise.

Bluetooth has always had issues: weak, predictable, or default PINs; non-changeable device names; and many others.

I've focused my energy on the mobile apps, APIs, and other back end systems because the attack surface is larger and the impact far greater than with a local attack. I've always known Bluetooth was an attack surface, but this was a decision about allocation of resources. My opinion is that someone attacking the device directly needs to be within about 6-10 feet of you (efforts are under way to establish this scientifically). Vendor and Pen Test Partners assertions of 30 feet come from ideal, line-of-sight bench tests. When worn, the device is surrounded by a meaty bag of water that attenuates the signal. In real-world terms, an attacker within 6-10 feet of you is inside your home, and you have bigger problems to worry about. If you choose to wear the device in public (as some people do for their own reasons), an attacker would still need to be within 6-10 feet unless they had a high-power directional antenna, which is obvious and hard to hide (is that a yagi in your pocket?). That puts them within punching range should they make it obvious they're the one laughing at your sudden surprise at being vibed.
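
For a rough sense of why the bench numbers and real-world numbers differ so much, here is a back-of-the-envelope sketch using the standard log-distance path-loss model. Every parameter below is an assumed, illustrative value, not a measurement from any particular device; the testing mentioned above is meant to produce real numbers.

```java
// Back-of-the-envelope range estimate via the log-distance path-loss model:
//   d = 10 ^ ((RSSI@1m - RSSI) / (10 * n))
// All parameter values below are assumptions for illustration, not measurements.
public class RangeEstimate {
    // rssi: weakest usable signal; rssiAt1m: signal measured 1 m from the toy;
    // n: path-loss exponent (~2 in free space, 3-4 through a body/indoors).
    static double distanceMetres(double rssi, double rssiAt1m, double n) {
        return Math.pow(10.0, (rssiAt1m - rssi) / (10.0 * n));
    }

    public static void main(String[] args) {
        double floor = -80.0; // assumed receiver sensitivity floor in dBm
        // Bench test, clear line of sight: ~10 m, i.e. the ~30 ft vendor figure.
        System.out.printf("Line of sight (n=2.0): %.1f m%n",
                distanceMetres(floor, -60.0, 2.0));
        // Worn as intended: the body eats ~5 dB up front and steepens the
        // falloff, landing near the 6-10 ft real-world estimate (~2.7 m).
        System.out.printf("Worn on body (n=3.5): %.1f m%n",
                distanceMetres(floor, -65.0, 3.5));
    }
}
```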

Some devices are designed and marketed for users who enjoy using them in public, and yes, there are risks in doing so. Static device names, known command sets, and an opportunity can lead to someone hijacking the device. However, with the vendor-approved method of use, the radio is surrounded by a large bag of meat and water which is very good at blocking RF energy in weird and variable patterns. So the window for a drive-by (walk-by?) attack is very small. An attacker would have to be well prepared, highly mobile, and very lucky to find a target of opportunity. A possible, but very unlikely, scenario.

(As a side note, at a great many events where one can guess wearable IoD device use would be prevalent, cameras and tech use are most often strictly forbidden. As well, the communities at these events are especially sensitive to consent, wouldn't take kindly to such shenanigans, and are likely well equipped with instruments specially built to beat any violators senseless.)

More testing and quantitative data collection is planned, along with more research into the practicality of direct-connect attacks. Suffice it to say, the risks posed by poor app, API, and back end system security can be a much greater threat. While you can hack one person standing next to you, I can hack 50,000 around the world at once. Time and effort are better spent on the latter.

As there are usually few if any sensors on the toy, compromise simply isn’t that serious.

Statements like this show a lack of understanding of the market and what's available. It's far more complex than you might think, and the tech in some devices is fascinating.

Many Kegel trainers have very sensitive pressure sensors for feedback, and devices like the OhMiBod Fuse and the Kiiroo Pearl 2 carry a number of sensors for bi-directional teledildonics. To say that toys have few if any sensors simply isn't accurate.

In fairness, Lovense released a much more secure version of the app called ‘Lovense Wearables’ though oddly they haven’t pushed all users to it

My year of involvement with the industry and its vendors has given me some insight into the ebb and flow of things, and into the why behind so much of it.

First, you need to understand the apps mentioned.

  • Lovense Wearables was the retail remote control software. It was available in the app store and was the primary software used by most users.

  • Lovense Bodychat was the app meant to be used by cam models for interacting with various cam sites' systems. It was not meant for normal retail customer use.

So there are a number of issues here. Replacing Bodychat with Wearables was not a solution as they were different apps with different functions and purposes. It's apples and oranges in a way.

Second, pushing out an update to either app was problematic, since Google and Apple have both repeatedly removed the apps from their respective stores for violations of rules that no one can seem to fully explain (that's a whole other article). At one point, Google even deleted Lovense's Google Play developer account for too many violations. These actions broke the ability to push new, patched versions of the apps and forced Lovense to have customers side-load them, which involves turning off some security features. Of course, side-loaded apps don't receive automated updates either, since they are independent of the app store.

Notice I've used the past tense when speaking of these apps. That's because both are deprecated, replaced by new apps: Lovense Remote for retail use, and Lovense Connect for cam model usage.

Remote was the v3.0 rename of the Wearables app; presuming your copy was obtained after the last Google ban and was not side-loaded, you would have received it as part of normal updates.

Connect was the rebirth of the Bodychat app under a new developer account. Anyone using the old Bodychat app, whether from Google Play or as a side-load, could not easily be transitioned due to Google's arbitrary decisions on app approvals. This left a large number of customers, who rely on these devices for income and may not be the most technically savvy, using the old version with its various insecurities.

Had Lovense cut off API access to the old app to force everyone onto the new, more secure one, there would have been a huge backlash. That leaves Lovense in the tough position of supporting a legacy system because there is no easy transition path. To Lovense's credit, their apps now check and alert users when a new version is available. Presumably this check happens outside the app store, which means that should they be banned again, they still have a way to alert users to new versions and the need to upgrade.
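
I haven't confirmed how Lovense implements that alert, but the general shape of an out-of-band update check is simple; everything in this sketch (the URL, the response format, the version numbering) is hypothetical.

```java
// Hypothetical sketch of an out-of-band update check; the URL and response
// format are illustrative, not Lovense's actual endpoint or implementation.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class UpdateCheck {
    static final int INSTALLED_VERSION_CODE = 30; // e.g. v3.0, assumed

    public static boolean updateAvailable() {
        try {
            URL url = new URL("https://example-vendor.invalid/app/latest-version");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                // Assume the server returns a bare version code, e.g. "31".
                int latest = Integer.parseInt(r.readLine().trim());
                return latest > INSTALLED_VERSION_CODE;
            } finally {
                conn.disconnect();
            }
        } catch (Exception e) {
            return false; // fail quietly; the app still works without the check
        }
    }
}
```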

In addition, I suspect, but have yet to confirm (that's another phase of the project), that some cam sites wrote their own interfaces to the Bodychat app and, for reasons of their own, have not been able or willing to transition. This lack of standardization exists because Google's (and Apple's) decisions broke the ecosystem developers expected to be there when they originally designed these systems.

There are other, smaller issues at play; however, the situation is far from as cut and dried as "Lovense doesn't care", as the article made it appear. Here is a company dealing with some very major issues beyond its control, while still trying to stay in business, keep its products operating, and keep its customers happy.


Conclusions

While it was nice to get the shout-out at the end of the article, and several points about impact were conceded, the rest of the article undermines the efforts of the IoD project and casts a disingenuous light on the industry through poor research and lack of understanding.

I fully admit that when I started this research, I had no idea this industry was so vibrant and active, or how many players and factors were involved. I too assumed vulnerabilities were due to a lack of caring, but after talking with vendors personally, I found they come from genuine naivete and lack of understanding. Vendors are new to this and working to catch up on 15 years of lessons learned.

Fortunately, Ken Munro and Pen Test Partners are civil people and have taken my critiques in stride. I still have more critiques to post; however, we've spoken at length and will be finding ways to work together. We both want to ensure that our work is true and correct, and that any debate or discussion is entirely factual.

It's nice to work with professionals. Hope it's nice to work with me.