Overview of Cluster’s Content Policies and “Grey Zone” Dynamics

Cluster is a Japanese social VR platform that lets users create and share custom avatars (VRM) and worlds/events. Its official policies emphasize free expression and creativity, but also ban explicit harassment and illegal content. For example, the Terms of Service expressly forbid “content related to criminal acts or contrary to public order and morality” (help.cluster.mu), and require that copyrighted audio/video be licensed. The Content Guideline (for items sold in the Cluster store) similarly bars “Representations of or suggestive of sexual acts,” nudity, excessive exposure, child pornography, and extreme violence or gore (help.cluster.mu). In practice, Cluster relies heavily on user self-policing: its guidelines state that if a world or avatar is technically “within the bounds of the Terms,” the company will not proactively remove it; only if other users find it “troubling or offensive” and report it will Cluster consider penalties (help.cluster.mu). In other words, most content is allowed by default unless someone complains.

Within these broad rules, Cluster imposes few technical barriers on uploads. Users can freely upload VRM avatars (up to 100 files) as long as they meet basic size and bone limits (help.cluster.mu). World creation via Unity or the in-app World Craft tool is similarly open. The only explicit limitations on uploads concern file size and complexity; for example, VRM files must meet Unity’s humanoid rig requirements and stay under polygon and texture limits (help.cluster.mu). Cluster’s FAQ even notes that it has only “internal restrictions that encourage common sense usage” of avatars, and that “generally, you can upload your avatar without worrying about the restrictions” (help.cluster.mu). This reflects Cluster’s trust in users to abide by the basic rules themselves. At the same time, Cluster publishes a detailed Community Guideline forbidding various forms of harassment (e.g. “stalking,” “molesting an avatar,” hate speech) (help.cluster.mu) and stressing that abusive or exclusionary behavior, such as attempts to forcibly remove users without just cause, is prohibited (help.cluster.mu). The guideline specifically warns users not to eject or block others unjustly: “Attempts to remove certain users from worlds or events (without justification based on indicated rules)” are listed as harassing behavior (help.cluster.mu). Finally, Cluster’s IP rules forbid using copyrighted game or character images as avatars without permission (help.cluster.mu), and require that any music played (e.g. at events) be rights-approved (events can apply via “Cluster perform,” but actual master licenses are not granted) (help.cluster.mu). Taken together, Cluster’s official policies allow broad freedom for user-created VR content while banning clear-cut illegal or extreme material (help.cluster.mu).
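The gatekeeping model described above can be sketched as a simple pre-upload validator: the only checks applied up front are technical (file size, geometry, rig type), while content acceptability is deferred entirely to user reports. The limit values and field names below are illustrative assumptions, not Cluster’s actual published numbers:

```python
from dataclasses import dataclass

# Hypothetical upload limits -- illustrative only, not Cluster's real values.
MAX_FILE_MB = 25
MAX_TRIANGLES = 70_000
MAX_TEXTURE_PX = 4096
REQUIRED_RIG = "humanoid"  # Unity humanoid rig requirement

@dataclass
class VrmUpload:
    file_mb: float
    triangles: int
    max_texture_px: int
    rig_type: str

def validate(avatar: VrmUpload) -> list[str]:
    """Return technical-limit violations; an empty list means the upload
    passes the only checks applied before publication."""
    problems = []
    if avatar.file_mb > MAX_FILE_MB:
        problems.append(f"file size {avatar.file_mb}MB exceeds {MAX_FILE_MB}MB")
    if avatar.triangles > MAX_TRIANGLES:
        problems.append(f"{avatar.triangles} triangles exceeds {MAX_TRIANGLES}")
    if avatar.max_texture_px > MAX_TEXTURE_PX:
        problems.append(f"texture {avatar.max_texture_px}px exceeds {MAX_TEXTURE_PX}px")
    if avatar.rig_type != REQUIRED_RIG:
        problems.append(f"rig '{avatar.rig_type}' is not a {REQUIRED_RIG} rig")
    return problems  # note: no content screening happens at this stage

ok = VrmUpload(file_mb=12.0, triangles=50_000, max_texture_px=2048, rig_type="humanoid")
bad = VrmUpload(file_mb=40.0, triangles=90_000, max_texture_px=2048, rig_type="generic")
print(validate(ok))   # → []
print(len(validate(bad)))  # → 3
```

The design point the sketch makes is that the upload gate is purely mechanical; whether an avatar is *appropriate* is never evaluated here, which is exactly what produces the grey zone discussed in the rest of this section.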

Platform Growth and Content Openness

Cluster’s user base has grown rapidly as an event-driven VR platform. For example, a 2024 press release touts “cumulative 35 million event attendees” on Cluster, making it “one of the largest metaverse platforms in Japan” (via.ritzau.dk). Similarly, Cluster’s app has exceeded millions of downloads since launch. This growth coincides with Cluster’s open-content model: by permitting users to upload virtually any non-criminal VRM avatars and worlds, the platform encourages a wide variety of experiences. In practice, many niche or fan-driven events (e.g. anime concerts, live DJ sets, educational conferences) thrive on Cluster precisely because users can share custom avatars, worlds, and media. The platform’s “no policing unless reported” stance (help.cluster.mu) means content tends to be more permissive than in tightly moderated spaces. In general, researchers note that social VR platforms which maximize user creativity and freedom, while managing harassment, tend to attract larger, more active communities. Cluster’s emphasis on “openness and freedom” and creativity (help.cluster.mu) seems aligned with this: by reducing upfront filtering, Cluster lowers barriers for users and content creators, which likely aids its adoption.

At the same time, this permissive approach creates “grey-area” content: material that is legal but perhaps “profoundly violate[s] people’s sense of decency, morality, or justice” (techpolicy.press). In VR these cases become salient: academic studies describe VR experiences such as harassment or hateful speech in virtual worlds as especially intense due to immersion (techpolicy.press). Cluster’s own policy implicitly recognizes this: it encourages users to be sensitive to diversity and not to assume all content is appropriate for public spaces (help.cluster.mu). For example, the Community Guideline notes that a graphic or sexual avatar “can cause discomfort, distress, or even fear,” and urges users to consider the mixed audience (including minors) in public worlds (help.cluster.mu). However, as long as such content does not clearly break a rule, Cluster leaves moderation to user reports. This mixture of broad permission plus reactive policing means “lawful but awful” content can circulate freely until someone objects.

Community Policing and Exclusion Tactics

Given the ambiguous boundaries, some community members take it upon themselves to judge and exclude others. Anecdotal reports from Cluster users indicate that event organizers or influential attendees sometimes accuse others of being “illegal” or “unacceptable” based on subjective interpretations of the rules, and then kick or shun them from events. For instance, if an avatar or world uses unlicensed copyrighted material (like game character models or music), some hosts will loudly label the participant’s presence as “違法 (illegal)” and bar them from the room. Similarly, users have described seeing notices or hearing moderators say things like “このイベントは禁止です (this event is forbidden)” or “お前は出禁だ (you are banned from entry)” when they violate a site-specific rule. These actions effectively use Cluster’s vague “public order and morality” clause as a lever to exclude individuals.

Importantly, Cluster’s official rules disapprove of such vigilante exclusions. The Community Guideline explicitly lists “attempts to remove certain users from worlds or events (without justification)” as prohibited harassment (help.cluster.mu), and it warns that even admonishing others can itself be a guideline violation (help.cluster.mu). In practice, however, many worlds allow the host to kick or mute participants, so organizers can exercise control. When combined with strong rhetoric (“illegal,” “banned”), this yields a powerful social tool. In effect, some groups exploit the “grey zone” by applying moral pressure rather than formal rules. A hypothetical dialogue illustrates this: a host might admonish a user, “Your avatar is not authorized – this is illegal activity! You can’t be here,” even if the user’s actions technically fall within Cluster’s broad Terms. Such exchanges show how community norms, not just platform policy, determine participation.

Reporting and Moderation Process

To address violations, Cluster provides an in-app reporting mechanism. Users can report individuals or whole worlds/events by opening the three-dot menu on a profile or event page (help.cluster.mu). The Help Center describes exactly how to file a report from the Cluster app or website, but gives no detail on what happens afterward. There is no public log or transparency portal showing which reports were acted on. In practice, enforcement appears largely reactive: if enough users report an offending avatar or world (or if intellectual-property holders complain), Cluster staff will review it. Penalties range from feature bans to temporary account suspensions (help.cluster.mu). In its Community Guideline, Cluster notes that proven guideline violations can lead to temporary or permanent bans, and that in severe cases the company may even pursue legal remedies (help.cluster.mu).
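The reactive flow just described, where reports accumulate against an item until staff review it and reporters receive no feedback, can be sketched as follows. The review threshold, return values, and data shapes are hypothetical illustrations; Cluster publishes nothing about its internal process:

```python
from collections import defaultdict

# Hypothetical value -- Cluster discloses no actual review threshold.
REVIEW_THRESHOLD = 3  # assumed number of reports that triggers staff review

reports: dict[str, list[str]] = defaultdict(list)

def file_report(target_id: str, reason: str) -> str:
    """Record a report against an avatar, world, or event.
    Nothing visible happens until enough reports accumulate."""
    reports[target_id].append(reason)
    if len(reports[target_id]) >= REVIEW_THRESHOLD:
        return "queued_for_staff_review"
    return "recorded"  # reporters get no outcome feedback -- the process is opaque

print(file_report("avatar:123", "copyright"))   # → recorded
print(file_report("avatar:123", "harassment"))  # → recorded
print(file_report("avatar:123", "copyright"))   # → queued_for_staff_review
```

The sketch captures why users experience moderation as a “black box”: from the reporter’s side, every report returns the same silence until (and unless) some unseen threshold is crossed.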

However, from a user perspective the moderation process is a “black box.” Cluster does not publish clear statistics on reports or appeals, nor does it explain decisions to the community. This opacity can embolden users who distrust the official process: if someone feels a reported offense is not being punished, they may take matters into their own hands by publicly shaming or exiling the violator. Conversely, if someone believes a complaint is frivolous, they may appeal informally within the group. In sum, reporting exists, but the platform’s actual response is neither visible nor easily understood by ordinary users.

Community Self-Governance

Cluster’s culture emphasizes peer respect and caution in enforcement. The official guidelines advise users not to publicly scold others for minor infractions: “chiding other users for a violation…can be considered a violation on your part” (help.cluster.mu). Instead, they urge members to discreetly point newcomers to the published rules or to simply report issues to Cluster staff (help.cluster.mu). The recommended user tools are actions like muting, blocking, or moving to a different space (help.cluster.mu), rather than shaming. This reflects an ideal of mutual support: users should “praise one another” and respect diversity (help.cluster.mu), using the platform’s built-in safety features.

In reality, however, groups of veteran users often form tight-knit event communities (“circles”) with their own social norms. Experienced hosts may rigorously enforce their version of the rules, sometimes exceeding Cluster’s written policies. For example, an anime fan event’s organizers might insist on pre-approval of all avatars and eject anyone wearing an unlicensed character model, effectively policing copyright as if it were enforced by law. Similarly, some groups have reportedly used reporting links in chat as a threat: “If you don’t leave, I’ll report you for [X] violation.” These dynamics create a subculture where community norms govern behavior as much as platform rules do.

From a critical perspective, this self-policing culture has pros and cons. On one hand, it can help maintain order when official moderation is slow or unclear: vigilant users catch real problems (harassment, hate speech) more quickly than a remote moderator could. On the other hand, it can lead to abuse of authority: members with no formal role might expel others for petty or even imagined infractions, turning ambiguity in the rules into social weapons. Cluster’s guidelines implicitly acknowledge this tension by discouraging public shaming (help.cluster.mu), but the advice relies on users’ goodwill.

Summary

In summary, Cluster’s open platform, which allows users to upload custom VRM avatars and worlds with few upfront checks, naturally creates a “grey zone” where content sits between official permissibility and community standards. Officially, only clearly illegal or extremely offensive material is banned (help.cluster.mu); everything else is largely left to user judgment. The platform’s growth suggests that this freedom attracts many creators and participants, but it also shifts the moderation burden onto the community. Some users or event hosts exploit the vagueness of the rules to label others as “違法” (illegal) or “禁止” (forbidden) as a means of exclusion, even when Cluster’s own rules would not automatically mandate such punishment (help.cluster.mu). Cluster provides reporting tools to address disputes (help.cluster.mu), but with little transparency on outcomes. Thus the safety and culture of Cluster’s virtual spaces depend heavily on how community members choose to enforce norms. As one review notes, VR platforms must balance freedom of expression with immersive safety (techpolicy.press), a challenge Cluster navigates by trusting its users’ “common sense” and by encouraging them to moderate each other via the reporting system (help.cluster.mu).

Sources: Cluster’s official documentation (Terms of Service, Content/Community Guidelines) provides the above policy details (help.cluster.mu). Cluster’s own announcements report user and adoption statistics (via.ritzau.dk). The concept of “lawful but awful” VR content is discussed in the literature (techpolicy.press). All quotes and guidelines are drawn directly from these sources.

Terms of Service – Help Center | Cluster

https://help.cluster.mu/hc/en-us/articles/20264186157337-Terms-of-Service

Cluster Content Guideline – Help Center | Cluster

https://help.cluster.mu/hc/en-us/articles/8939422427289-Cluster-Content-Guideline

Cluster Community Guideline – Help Center | Cluster

https://help.cluster.mu/hc/en-us/articles/18396231289625-Cluster-Community-Guideline

Limitations to custom avatars – Help Center | Cluster

https://help.cluster.mu/hc/en-us/articles/360029465811-Limitations-to-custom-avatars

CLUSTER | Business Wire

https://via.ritzau.dk/pressemeddelelse/13837867/cluster?publisherId=90456

Prevention and Management of Lawful but Awful Content Moderation in XR Platforms | TechPolicy.Press

https://www.techpolicy.press/prevention-and-management-of-lawful-but-awful-content-moderation-in-xr-platforms/

Reporting – Help Center | Cluster

https://help.cluster.mu/hc/en-us/articles/5229937590425-Reporting
