We spend a lot of time thinking about what to post on Facebook. Should you argue with that political point your high school friend made? Do your friends really want to see yet another photo of your cat (or baby)?

Most of us have, at one time or another, started writing something and then, probably wisely, changed our minds.

Unfortunately, the code in your browser that powers Facebook still knows what you typed — even if you decide not to publish it. It turns out that the things you explicitly choose not to share aren't entirely private.

Facebook calls these unposted thoughts "self-censorship." Insights into how the social network giant collects these nonposts were recently revealed by two Facebookers. The study, conducted by Facebook data scientist Adam Kramer and summer software engineer intern Sauvik Das, shows how Facebook monitors our unshared thoughts and what it thinks about them.

The duo examined self-censorship behavior by 5 million English-speaking Facebook users, including their aborted status updates, posts on other people's timelines and comments on others' posts. To collect the text you type, Facebook sends code to your browser. That code automatically analyzes what you type into any text box and reports metadata back to Facebook.
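To make the mechanism concrete, here is a minimal sketch of how a page script could track a text box and report only metadata, in the way the article describes. All names here (`SelfCensorshipTracker`, the field names, the events) are hypothetical illustrations, not Facebook's actual code.

```typescript
// Hypothetical sketch of client-side self-censorship tracking.
// The key property: only metadata is produced, never the text itself.

interface CensorshipMetadata {
  charactersTyped: number; // peak length of what the user entered
  posted: boolean;         // did the user ultimately submit?
  selfCensored: boolean;   // typed something, then abandoned it
}

class SelfCensorshipTracker {
  private maxLength = 0;
  private submitted = false;

  // Called on every input event from the composer text box.
  onInput(currentText: string): void {
    this.maxLength = Math.max(this.maxLength, currentText.length);
  }

  // Called if the user actually publishes the post.
  onSubmit(): void {
    this.submitted = true;
  }

  // Called when the user navigates away or closes the composer;
  // the returned object is what would be reported to the server.
  finalize(): CensorshipMetadata {
    return {
      charactersTyped: this.maxLength,
      posted: this.submitted,
      selfCensored: this.maxLength > 0 && !this.submitted,
    };
  }
}
```

In a real page, the metadata object would typically be sent to the server on unload (for example via `navigator.sendBeacon`); the point of the sketch is simply that the browser-side code sees every keystroke, while the report it sends home can be limited to a summary.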

Storing text as you type isn't uncommon on other websites. For example, if you use Gmail, your draft messages are automatically saved as you type them. Even if you close the browser without saving, you can usually find a (nearly) complete copy of the e-mail you were typing in your Drafts folder.

Facebook is using essentially the same technology here. The difference is that Google is saving your messages to help you. Facebook users don't expect their unposted thoughts to be collected, nor do they benefit from it.

It is not clear to the average reader how this data collection is covered by Facebook's privacy policy. In Facebook's Data Use Policy, under a section called "Information we receive and how it is used," the company makes clear that it collects the information you choose to share, as well as information gathered when you "view or otherwise interact with things."

But nothing suggests that it collects content you explicitly don't share. Typing and deleting text in a box could be considered a type of interaction, but very few of us would expect that data to be saved. When Facebook was contacted for this story, a representative said the company believes this self-censorship is a type of interaction covered by the policy.

In their study, Das and Kramer claim to only send back information to Facebook that indicates whether you self-censored, not what you typed. The Facebook rep agreed that the company isn't collecting the text of self-censored posts. But it's certainly technologically possible, and it's clear that Facebook is interested in the content of your self-censored posts.

Das and Kramer's study closes with the following: "we have arrived at a better understanding of how and where self-censorship manifests on social media; next, we will need to better understand what and why."

This implies that Facebook wants to know what you are typing in order to understand it. The same code Facebook uses to check for self-censorship can tell the company what you typed, so the technology to collect the content it wants already exists.
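How small is the gap between metadata and content? The hypothetical sketch below (not Facebook's actual code) shows that once the browser-side hook exists, the difference between the two comes down to a single field in the reported payload.

```typescript
// Hypothetical illustration: the same input hook, with one extra
// field, ships the text itself rather than just metadata about it.

function buildPayload(
  text: string,
  includeContent: boolean
): Record<string, unknown> {
  const payload: Record<string, unknown> = {
    charactersTyped: text.length,
    selfCensored: text.length > 0,
  };
  if (includeContent) {
    // One line is all that separates counting keystrokes
    // from collecting the unpublished post itself.
    payload.content = text;
  }
  return payload;
}
```

The technical barrier, in other words, is negligible; whatever keeps the content on your machine is policy, not capability.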