
Public-space screens: to whom—and how far—are swimsuit and pornographic images designed to be visible?


What would bother you if it appeared on someone else’s tablet?

In November, a passenger photographed U.S. Democratic Representative Brad Sherman viewing images of women in underwear on a flight. The photo was posted to the X account “Dear White Staffers,” a widely followed anonymous account about U.S. political staffers, and it spread.

The “visibility plane” is how a screen is designed to be visible—who can see it, and from how far. In its narrow sense, this is about physical visibility. In a broader sense, it also covers what is displayed and how it is arranged and operated in public settings. Seen through the lens of sexual wellness—understood as bodily, mental, and relational well-being around sexuality—our acts of “seeing” and “showing” now depend heavily on design and operation.

Is visibility in public spaces a matter of design or a matter of manners?

Images with high exposure, such as pornographic content and swimsuit or lingerie photo spreads (similar to Japanese “gravure” magazine features), can affect not only the device owner but also the bodily comfort of people nearby.

Being suddenly shown unwanted images can cause stress, flashbacks, and bodily tension that amount to a significant psychological and physical burden. On the hardware side, published manufacturer specifications vary: some attachable privacy filters begin blacking out the screen at roughly 30° off-axis (Source: 3M), while others leave a “visible range” of only about 70° (Source: HP Sure View).

Viewing distances and angles are largely fixed before any user action, by device and service design and by an organization’s choice of which devices and filters to adopt. The boundary between what is shown and what is hidden is thus set in the default state, prior to any individual judgment.

Modes and rules can be changed later, but the more basic question is how far visibility is designed in the first place. Once the adjustable elements on the design and operational side are identified, what remains can be reconsidered as a matter of manners: which apps to open in public spaces, and in which situations to expose a screen in front of others.

This way of designing what is visible works similarly on commuter trains and in cafés, and should be treated as a prior design condition, apart from arguments about individual restraint.

Recommendations aren’t random: how display rules shape what appears in public spaces

A smartphone user in a crowded train: daily exposure to personal screen content in commuter environments.
© Synthesis

Beyond the screen’s sightlines, visibility in the sense of what is displayed is also shaped by design and operation.

Representative Sherman blamed X’s “For You” feed, but recommendation systems (mechanisms that infer preferences from browsing history and automatically determine the order of items shown) are not accidental. Current algorithms run the same way whether or not the user is in a public space, yet once carried into one, they strongly steer what appears on screens there.

Small rule changes could reduce incidental exposure to sensitive images, such as sexual or violent content, that can shock children or survivors: making a non-personalized chronological feed a one-tap initial option, for example, or providing a “low-exposure mode” for public spaces.

In Europe, the EU Digital Services Act (DSA), a regulatory framework for large online platforms, has entered into force, and enforcement has begun. The framework aims to protect minors and prevent the spread of illegal content. In implementing the DSA, platforms are required to take measures such as making recommendation systems more transparent, offering non-personalized feeds, and expanding user options, which may also work to suppress recommended displays that tend to create unwanted visual exposure.

Screens in public spaces and labor asymmetries—who keeps “being exposed” to them

An empty airplane cabin with seat-back screens: screen visibility and exposure angles in an enclosed public setting.
© Synthesis

A month earlier, in October, flight attendants witnessed a senior official of the Japan Football Association viewing child sexual abuse material (CSAM) in international business class on an overseas trip. He was detained in Paris during transit, charged with offenses including possession and importation of CSAM involving persons under 15, and convicted in expedited proceedings in a French court.

Accessing or possessing CSAM is a grave crime in any setting and is primarily a matter to be addressed through criminal penalties as well as monitoring and reporting systems. In addition to the illegal act itself, there is also a structure that places flight attendants—who must keep patrolling the cabin and cannot simply walk away—in a position where they end up repeatedly witnessing such screens.

In experiments measuring how easily screens can be shoulder-surfed in office environments, attempts to obtain on-screen information reportedly succeeded in nearly 90% of cases (Source: Ponemon Institute). For workers like flight attendants, who must keep circulating and cannot easily step away, this “peekability” translates into a higher likelihood of unwanted exposure.

Even if the environments differ, the structure in which screens remain visible from multiple angles is similar in stores and on aircraft. Workers carry that stress day to day while being required to maintain smiles and composure. Service-providing organizations need to review screen design (devices, placement, and usage rules) so that workers are protected first, not only passengers and other recipients of the service.

Design choices at multiple layers—devices, platforms, and service providers—together shape a multifaceted visibility in public spaces. Device and filter design shape the physical question of how far screens can be hidden from sightlines. Recommendation algorithms and display modes steer what kinds of images tend to appear in the first place. And the design of placement and rules largely determines who keeps being exposed to those screens.

When encountering news that tends to be consumed as “who looked at what,” in addition to questioning the harmful acts themselves, directing attention to “to whom and how far it is designed to be visible” may make the news itself look a little different.

Synthesis Editorial Team

Based in Tokyo, we explore the structure beneath narratives, drawing on experience in publishing and media. We follow how news and culture lean on stock frames, and share ways of seeing the world a little differently.