In this essay, I will focus on the phenomenon of hypernudging rather than microtargeting, although the two concepts are closely interlinked, and discuss whether hypernudging constitutes a threat to children's autonomy in particular.
In sections (1) and (2) we discuss the concept of autonomy, whether children have autonomy, and their special status in society. In section (3) we define hypernudging and discuss some cases of hypernudging in children's technologies. In section (4) we argue that hypernudging constitutes a threat to children's autonomy, and address some arguments against that case. In section (5) we discuss one measure that could mitigate this threat: improving children's literacy around privacy and computing.
1. What is autonomy
Autonomy is a broad and well-researched term within philosophy, so for this essay we will focus on personal autonomy, as opposed to moral or political autonomy, since it is the most relevant to the hypernudging of an individual. We will focus on procedural accounts of autonomy, which consider a particular action autonomous if it is the result of critical reflection by the individual.
Antle and Kitson (Antle & Kitson, 2021) refer to a model of personal autonomy described by Burr et al. (Burr et al., 2020), building on work by Rughiniş, Rughiniş and Matei (Rughiniş et al., 2015). The model has five dimensions, two of which are of note here because of how hypernudging undermines them: the degree of personalisation, and whether the technology promotes moral deliberation or values in the actions it recommends.
Christman develops a theory of autonomy based on how the values we develop over our lives shape the decisions that we make, meaning that for decisions to be considered autonomous, the development of these values needs to be free from interference (Christman, 1991). For this essay, we will use Christman's definition of autonomy, as it is particularly relevant when considering children, who are still very early in the process of developing values and desires.
2. Children and autonomy
In this section, we explore whether children have autonomy and their status as vulnerable citizens in society, and settle on a model within the dominant space of stage theory. There is no particular consensus among scholars on the definition of a child; indeed, even societally we see that different countries set different ages for “adult” activities such as driving or drinking alcohol. Ariès, in his book Centuries of Childhood, argued that medieval society had no idea of “childhood”, although this has been countered by scholars across disciplines (Aries, 1965).
Aristotle posited the idea of children as immature versions of adult humans, and this continues to be the Western view of children, even for those who have never encountered the original idea. Stage theory builds on this idea to consider childhood as one or more stages of maturity on the way to becoming an adult.
Rousseau offered five stages for a person to reach moral maturity (Rousseau, 1979), and Piaget developed a stage theory of cognitive development suggesting that children move through four stages of mental development, focussing on knowledge acquisition and the nature of intelligence in children (Piaget & Inhelder, 1974). Although widely adopted, stage theory also has its critics - Gareth Matthews argues that Piagetian-type stage theories of development favour a “deficit conception” of childhood (i.e. children are considered in terms of missing adult capacities), which does not hold up well when we consider that there are some things children are better at than adults, such as picking up new languages (Matthews, 2009).
A version of stage theory relevant to the autonomy of children is that of Hoffman, which describes four stages of developing empathetic feelings and responses, most notably deviating from other extensions of stage theory by attributing genuine moral feelings and agency to very small children (Hoffman, 1982).
For this essay, we will use the definition of a child given by Brighouse and Swift, who argue that a child has certain necessary features: vulnerability, dependence, no conception of value, and the capacity to develop into an independent adult (Brighouse & Swift, 2016).
Mullin argues that children’s capacity for autonomy can be considered as their self-governance in service of what they care about, and thus a child’s attachment to a parent or loved one can be a source of autonomy for them (Mullin, 2007). Societally, we respect children’s agency in certain situations, such as consulting them on medical decisions, but this tends only to happen up to a point, beyond which decisions are deferred to a guardian.
Dworkin argues that “not all paternalistic acts involve interference with liberty”, citing the examples of a doctor lying to a terminally ill patient, or a psychiatrist informing an adolescent patient’s parents about their drug use (Dworkin, 1988). However, we tend to consider paternalistic acts less appropriate as children become older, culminating in them being seen as inappropriate by the time a person reaches adulthood.
Giesinger explores a two-level idea of vulnerability in children, showing that children are in more danger of being harmed: any harm they incur harms them both immediately and on a developmental level, and they have less capability to avoid harm than adults do (Giesinger, 2019).
Ryan and Deci’s model for children’s well-being also includes autonomy - they build on self-determination theory and propose that autonomy, competence and relatedness are all necessary for human well-being and flourishing (Ryan & Deci, 2000).
For this essay, we will follow stage theory as a model of childhood, particularly Hoffman’s view that small children can have moral feelings and agency, but also follow Giesinger’s conclusion that they are an especially vulnerable class, hence needing particular special protections. This idea is generally accepted socially, as shown by existing movements that focus on protecting children in particular (as opposed to all people) from the negative influences of technology, such as the Youth Manifesto for a Better Internet (Childnet, 2015).
3. What is hypernudging
In this section, we discuss the concepts of nudging and hypernudging, and then discuss three examples of hypernudging in children’s technology: purchases within video games, the smart Hello Barbie doll, and advertising in YouTube videos.
The philosopher Rainer Forst defines power as “the capacity of A to motivate B to think or do something that B would otherwise not have thought or done” (Forst, 2015). Tying into this idea, the term “nudge” was popularised by the book Nudge (Thaler & Sunstein, 2009), with “nudge theory” referring to the idea that positive reinforcement and indirect suggestion can be used as an effective way to influence the behaviour and decision making of individual people or groups.
Taking the idea of a “nudge”, Yeung coined the term “hypernudge” in reference to personalised nudges driven by Big Data (Yeung, 2017). Yeung calls hypernudging “nimble, unobtrusive and highly potent”, and Sætra takes this one step further to say that “With Big Data, nudging can become so effective that it is hard to withstand it, making the nudge more of a shove” (Sætra, 2019).
With the term hypernudging defined, it’s useful to see a few concrete examples of where this is already being implemented, particularly in technologies that children use.
The first is in video games, where we see hypernudging applied very effectively to pressure users (both children and adults) into making microtransactions within the game. Techniques such as creating artificial scarcity, pay-to-win mechanics and tactically-timed spending prompts are common, and there is often a lack of safeguards, making it difficult for parents to limit their children’s spending (5Rights Foundation, n.d.-b).
One analysis concludes that “some in-game purchasing systems could be characterized as unfair or exploitative”, owing to “informational advantages” such as behaviour tracking used to optimise players’ spending (King et al., 2019).
A second example is the controversial Hello Barbie doll released in 2015. The doll (now discontinued) would record anything said nearby and send it to a company called ToyTalk, which would analyse the conversation and provide a personalised response (Gibbs, 2015). There was understandable concern over the acceptability of this mechanism, and as Steeves pointed out, “The conversation accordingly has a direct benefit to Mattel because, throughout the doll’s ongoing dialogue with the child, the corporation can steer the child’s emotions to make him/her more amenable to the appeal of its products” (Steeves, 2020).
Steeves also points out Mattel’s desire to move away from being seen as holding dated ideas about women in society: “Mattel’s utterances are sensitive to this historical and political context and attempt to defuse it. For example, the dataset includes frequent references to Barbie as a learner, who is especially interested in non-traditional fields for women like science”. However, she also points out where the doll still contains biases in its conversation: “Barbie’s 8000 lines of dialogue contain no mention of Islam or Eid, even though there are 3.45 million Muslims in the United States”.
Although parents are given the ability to delete the stored data, it is probably unreasonable to expect a parent to spend time regularly reviewing it, so effectively the only choice is to accept all the privacy issues in return for being able to use the doll - “free choice is illusory; choice only exists between using or not using the smart toy or its full array of functionalities” (Keymolen & Van der Hof, 2019).
A third example of hypernudging in children’s technologies is on YouTube. A recent Ofcom report found that 58% of children use YouTube every day (Ofcom, 2021), and many YouTube “influencers” have followings of millions of people - “Many children and young adolescents admire these influential youths and aspire to attain their lifestyles” (De Jans et al., 2018).
Qualitative research has shown that children have difficulties recognising hidden and embedded advertising in YouTube videos (Martínez & Olsson, 2019), and De Jans et al. showed that a disclosure in an influencer marketing video led to higher recognition among children that the video was an advertisement (De Jans et al., 2018).
It should be noted that studies show mixed results on whether brand desire is impacted by disclosure for children, with some studies showing that disclosure can have a negative effect on brand preferences (An & Stern, 2011), and others showing that disclosure can have no effect (Vanwesenbeeck et al., 2017) or even a positive effect (De Pauw et al., 2018).
4. Hypernudging as a threat to children’s autonomy
Having defined hypernudging and given some examples, in this section I will argue why hypernudging does constitute a threat to children’s autonomy, and a strong one at that. I will then address four arguments against this case: (1) that hypernudging could be used beneficially for children and their autonomy, (2) that children could provide consent for hypernudging, (3) that child-specific specialised platforms could be developed, and (4) that children should be exposed to hypernudging to prepare them for adult life.
Firstly, Yeung herself states that hypernudging is “distinctly manipulative, if not straightforwardly deceptive”, with this arising because the mechanism is “deliberately exploiting systematic cognitive weaknesses which pervade human decision-making to channel behaviour in directions preferred by the choice architect” (Yeung, 2017). We have no reason to assume that children are exempt from this statement: we consider children to have decision-making skills while also being vulnerable, and so likely to have more cognitive weaknesses than adults.
Not only is hypernudging a threat to children’s autonomy, but it is also unethical - Smith and de Villiers-Botha state that “Myriad societal, moral, and political structures are predicated on the idea that people are able to choose and act autonomously. Consequently, as the most cursory glance at the applied ethics literature will confirm, the violation of autonomy in various contexts is often taken to be unethical” (Smith & de Villiers-Botha, 2021).
Indeed, even when Thaler and Sunstein first proposed nudge theory, they faced criticism that nudges could be used for illegitimate purposes, and that they were a form of deception since they exploited cognitive weaknesses (Bovens, 2008). Wendler states that deception is morally wrong because it violates the autonomy of the person deceived, involving the control of another without that person’s consent (Wendler, 1996).
If we take Ryan and Patrick’s definition of autonomy for children as referring to their freedom to set their own goals and objectives (Ryan et al., 2009), it is clear that hypernudging violates it. Taking an example from Antle and Kitson, we can imagine an on-skin monitor of a child’s glucose levels, with an algorithm telling the child to eat more or less. This would take away their independent decision-making, with the worst case being the development of food-related disorders (Antle & Kitson, 2021).
Siegler et al. note that the development of a strong sense of autonomy is important for a child’s sense of themselves as secure and confident (Siegler et al., 2017), and Emily Buss notes that for young children, the right to make certain decisions allows them to practise the thinking needed to develop their future capacity for meaningful agency (Buss, 2009). From this, we can see how damaging it is to a child’s development for a mechanism such as hypernudging to take autonomy away from them, especially at a young age.
Not only does hypernudging pose a threat to children’s autonomy, it also violates children’s rights more widely. A theory such as social contract theory does not grant rights to children, since they are not widely considered to possess physical and mental capacities equal to those of adults, but other approaches provide justification for why children’s rights should be recognised as human rights.
Dixon and Nussbaum argue that a capabilities approach (Nussbaum, 2009) “provides a clear account for why children's rights should be recognized as human rights, because every human being, under this approach, is entitled to respect for her full human dignity”. They argue that the underdeveloped capabilities of children “exert a moral claim that they should be developed up to the point at which they reach the threshold level of each capability specified on the capabilities list”, and that these capabilities provide two justifications for granting special priority to children’s rights - the first is their vulnerability due to dependency on adults, and the other is that if protecting their rights could be done at a marginal cost and was not, then this would be a direct affront to the child’s dignity (Dixon & Nussbaum, 2012).
Feinberg coined the idea of a “child’s right to an open future” to refer to the moral rights (a set of autonomy rights-in-trust) to be saved for a child until they become an adult. According to him, “Children cannot exert these rights yet, but these rights can be violated before the child acquires the capability to act autonomously” (Feinberg, 1980).
However, in the case of children, it can be hard to find the balance between protecting and empowering them. Macenaite argues in her exploration of applying the GDPR guidelines to children that “There are specific dilemmas that the introduction of the child-tailored online privacy protection regime creates – the ‘empowerment versus protection’ and the ‘individualized versus average child’ dilemmas” (Macenaite, 2017).
Another consideration is that current technology tends to be targeted at the “average” user: “biowearables are often designed from the perspective of an affluent, active, adult male, which may not be applicable to many other users” (Antle & Kitson, 2021). For technology not made specifically for children, this means they are using products targeted at a very different user from themselves.
Goodyear et al. argue that the lack of child-specific biowearable devices reduces autonomy and agency in young people in terms of being able to “examine their subjective feelings about their health and their bodies” (Goodyear et al., 2019). It should be noted that even if we were to make child-specific technologies, they would still be targeted at a “normal” child, thus excluding children who fall outside the norm for their age group, for example children with disabilities.
Something that cannot be ignored is that hypernudging is, by definition, only made possible by having huge amounts of data on an individual, and thus much of the hypernudging that takes place comes from large tech corporations. As Véliz points out, “Many companies would gladly take away all of our freedom. Such corporate disregard for autonomy is a new type of soft authoritarianism” (Véliz, 2021). This is also recognised by the Campaign for a Commercial Free Childhood in its Hell No Barbie campaign, which called the Hello Barbie doll’s practices “creepy” and argued “Kids using Hello Barbie aren’t only talking to a doll, they are talking directly to a toy conglomerate whose only interest in them is financial” (Lobosco, 2015).
Zuboff points out the massive power asymmetry between large tech companies and their users (Zuboff, 2015), and it is clear that by continuing to engage with these companies’ products, the relationship between the companies and their users becomes what Dow Schüll refers to as “asymmetric collusion” (Schüll, 2014).
It should be noted that as a society, we do tend to accept removing some autonomy from children when it is for the child’s good; while what counts as a “child’s good” may be a subject of debate, manipulating children for a company’s profit is unlikely to qualify. Smith and de Villiers-Botha argue “even if there are legitimate instances of hypernudging children, private companies are not the appropriate parties for carrying out such hypernudging. Only accountable parties that are mandated with looking after the welfare of the children involved should be allowed to interfere with their decision-making in this way” (Smith & de Villiers-Botha, 2021).
The collection and usage of large amounts of data also have strong implications for an individual’s right to informational privacy, and as Yeung points out, “the right to informational privacy includes the ‘purpose specification principle’, requiring data collectors to state clearly the explicit purpose of collecting and processing that data at the time of collection” (Yeung, 2017). Yet companies often collect more data than necessary in the hope that they will find a use for it, making informed consent extremely hard to give. In the context of big data, notice is often either too restrictive for value to be derived from the data, or too empty to protect an individual’s privacy (Nissenbaum, 2009).
As a final note on the dangers of hypernudging for children, although legislation exists to protect children from the dangers of technology, it is often not very effective at protecting their autonomy: “these protections are often inadequate in protecting children against autonomy-related harms stemming from maleficent forms of online decision interference. This is partly due to the primary focus of such legislation being on protecting the values of privacy and consent when it comes to obtaining and processing data on children” (Smith & de Villiers-Botha, 2021).
Legislation can often hinge on there being evidence of negative impact, but Gaspar points out that clinical psychology studies can take a long time to produce reliable data, and in the meantime we will be “exposing several generations of children to the examined risk in the process” (Gaspar, 2018).
4.1 Hypernudging as a positive force for children
There is an argument for using hypernudging for positive effects on children: for example, Mills argues that hypernudging could significantly improve people’s behaviours around health and finance (Mills, 2019), and the UN Global Pulse explores areas in which big data could have positive effects, including ones that benefit children, such as early warning signs for natural disasters and improved access to services (UN Global Pulse, 2013).
Hypernudging for positive effect would still need large amounts of Big Data to work, and this tends to be the norm for modern technology: Mayer-Schönberger observes that remembering information has become the default mode of our technology, since there is no longer an inherent cost to keeping data stored (Mayer-Schönberger, 2011). However, there is cause for concern here - security expert Bruce Schneier argues that “data is a toxic asset” (Schneier, n.d.), and the European Union recognises this with the right to be forgotten (now replaced by the more limited right to erasure).
Véliz points out that “Forgetting is not only a virtue for individuals, but also for societies. Social forgetting provides second chances … Societies that remember it all tend to be unforgiving” (Véliz, 2021). A compelling proposal is that from Renda, who proposes the “clean data slate” for children: that there should be no data stored at all on children up to a certain age (Renda, 2020).
Another argument against the usage of big data (for hypernudging and also other general forms of microtargeting) is that so-called “anonymous” data often turns out not to be anonymous at all. A well-known case is that of Latanya Sweeney, who in 1997 showed Governor William Weld that she could find his medical records in an anonymised dataset from the Massachusetts Group Insurance Commission. More recently, in 2013 she showed that she could correctly identify between 84% and 97% of “anonymous” participants in the Personal Genome Project (Sweeney et al., 2013).
Building on Papacharissi’s idea of a “networked self”, i.e. an amalgamation of online identities across various platforms (Papacharissi, 2010), Helmond argues that an individual’s online identity is in perpetual beta, since it is constantly being updated with new information, and that an individual’s online data can also be generated by other users (Helmond, n.d.). In the case of children, it is possible for them to have an online presence before even having their own accounts, via parents putting up pictures of them from a young age.
Considering the lack of anonymity possible with data, and that today’s children will have more data online about them than ever before, ever more strongly-targeted hypernudges for children become possible; these could impact their education and future job prospects, and even give tech companies the ability to shape children’s political views from a very young age. These possibilities far overshadow any positives, by fundamentally undermining many of children’s rights throughout their lives.
4.2 Consenting to hypernudging
An argument could be made that hypernudging is ethically acceptable if individuals, including children, are given notice of what nudging will take place and consent to it. Yeung points out that “liberal political theory has little to say about such techniques – if their use is adequately disclosed and duly consented to, there is nothing further of concern: individual autonomy is respected, while the market mechanism fosters innovation in the digital services industry" (Yeung, 2017).
However, terms and conditions are well-known for being difficult to understand, and are rarely read or understood by consumers before consenting. Chung and Grimes’ analysis of child-oriented websites revealed no clear guidelines on how informed consent is obtained on these sites, and found that the notices were often given in language unlikely to be understood by adults, let alone children (Chung & Grimes, 2005).
The GDPR guidelines acknowledge this, and provide extra requirements for requests for erasure relating to children’s data, since they consider that a child may not have been fully aware of the risks of consenting, i.e. they did not give free and informed consent.
4.3 Developing child-specific technologies
While the idea of making technology that provides children with a more controlled experience of the world may sound good, in reality many such measures can be easily circumvented with various levels of social engineering. Take the problem of social media platforms recommending unknown adults as friend suggestions to children. The 5Rights Foundation found that “in 2020, 75% of the top 12 most popular social media platforms used AI to recommend children’s profiles to strangers” (5Rights Foundation, n.d.-a).
One suggestion for resolving this is to only show other appropriately-aged children as friend suggestions to children on a social media platform. But this could be abused by malicious adults signing up with the profile of a child (already a common tactic in catfishing scenarios). It would also rely on children stating their correct age when signing up, when in reality there is no easy way to enforce this, and they may be incentivised to lie about their age if they know a platform is less restrictive for adult users.
It should be noted that various child-specific platforms have existed (and still do), most notably YouTube Kids, where the most popular children’s channels have tens of millions of subscribers (Ceci, 2022). Some child-specific technology has been successfully campaigned against, including the previously mentioned Hello Barbie and Mattel’s “AI-powered babysitter” Aristotle (Hern, 2017). However, platforms such as YouTube Kids remain in popular use, and despite YouTube launching a separate app and experience for children, it is still overrun with a variety of inappropriate videos, with one study finding that 34% of the search results for the popular TV show “Peppa Pig” were inappropriate (Papadamou et al., 2019).
4.4 Preparing children for hypernudging in adulthood
Given that hypernudging exists on many online platforms, it may be tempting to say that children should be exposed to it from a young age in order to prepare them for life as adults online.
However, as discussed above, terms and conditions are often not easy even for adults to understand, and considering how complex some hypernudging can be, it is doubtful that terms and conditions explaining the usage of children’s data could ever be written in a way small children could understand.
Véliz also points out that it would normalise something unethical - “What excessive surveillance does is teach kids that human rights do not have to be respected. We cannot realistically expect people who have been taught as children that their rights do not matter to have any respect for rights as adults ... By oversurveilling children, by oppressing them with a Thought Police, we risk bringing up a generation of people who were never allowed to grow up” (Véliz, 2021).
5. Improving children’s privacy and technology literacy
It should be noted that many of the reasons for considering hypernudging a threat to children’s autonomy centre on children’s lack of understanding of the technology they are using and of how it might act maliciously in their lives. Véliz argues that “Magic tricks catch our attention and inspire awe in us, even when we know they are illusions. The spell is only broken once we are told how the trick is done. In the same way, understanding how personalized content gets designed and for what purposes might take away some of its power – it might break the spell” (Véliz, 2021).
The UK Children's Commissioner's report suggests a compulsory digital citizenship programme for children aged 4 to 14, and many researchers have also pointed out the need for better IT literacy in children. An interesting proposal is that of Tissenbaum, Sheldon and Abelson, who explore computing education that "both teaches and empowers" and propose a new framing around "computational action", which engages children by having them create technology for their own lives, allowing them to contextualise the design ethics required for creating technology (Tissenbaum et al., 2019).
With this in mind, it is possible that hypernudging would constitute less of a threat to children's autonomy if children had better literacy around technology and privacy, although this alone would likely not completely resolve the issues.
In conclusion, we can see how hypernudging constitutes a threat to children in particular, since they are at a more vulnerable stage of their lives than adults and more susceptible to deceptive practices. Furthermore, considering that hypernudges are likely to come from companies with no interest in an individual’s wellbeing, these nudges are unlikely to be to an individual’s benefit, and more likely to focus on increasing a company’s profit.
With children today growing up in a more interconnected and online world than ever, they are more vulnerable than previous generations to becoming victims of hypernudging, due to the large amounts of data collected on them throughout all parts of their lives. The threat of hypernudging encroaching on children’s autonomy is also likely to continue growing in the coming years, even as children’s technology literacy increases.
References
5Rights Foundation. (n.d.-a). Friend suggestion systems. Risky-By-Design. Retrieved 22 March 2022, from https://www.riskyby.design/friend-suggestions
5Rights Foundation. (n.d.-b). In-game purchases. Risky-By-Design. Retrieved 22 March 2022, from https://www.riskyby.design/in-game-purchases
An, S., & Stern, S. (2011). Mitigating the Effects of Advergames on Children. Journal of Advertising, 40(1), 43–56. https://doi.org/10.2753/JOA0091-3367400103
Antle, A. N., & Kitson, A. (2021). 1,2,3,4 tell me how to grow more: A position paper on children, design ethics and biowearables. International Journal of Child-Computer Interaction, 30, 100328. https://doi.org/10.1016/j.ijcci.2021.100328
Aries, P. (1965). Centuries of Childhood: A Social History of Family Life. Vintage.
Bovens, L. (2008). The Ethics of Nudge. In M. J. Hansson & T. Grüne-Yanoff (Eds.), Preference Change: Approaches from Philosophy, Economics and Psychology. (pp. 207–220). Berlin: Springer, Theory and Decision Library A.
Brighouse, H., & Swift, A. (2016). Family Values: The Ethics of Parent-Child Relationships (Reprint edition). Princeton University Press.
Burr, C., Taddeo, M., & Floridi, L. (2020). The Ethics of Digital Well-Being: A Thematic Review. Science and Engineering Ethics, 26(4), 2313–2343. https://doi.org/10.1007/s11948-020-00175-8
Buss, E. (2009). What the Law Should (And Should Not) Learn from Child Development Research. Hofstra Law Review, 38, 13.
Ceci, L. (2022, March 18). YouTube most subscribed kids content channels 2022. Statista. https://www.statista.com/statistics/785626/most-popular-youtube-children-channels-ranked-by-subscribers/
Childnet. (2015, April 29). Youth Manifesto for a Better Internet. Childnet. https://www.childnet.com/blog/youth-manifesto-for-a-better-internet/
Christman, J. (1991). Autonomy and Personal History. Canadian Journal of Philosophy, 21(1), 1–24. https://doi.org/10.1080/00455091.1991.10717234
Chung, G., & Grimes, S. M. (2005). Data Mining the Kids: Surveillance and Market Research Strategies in Children’s Online Games. Canadian Journal of Communication, 30(4), Article 4. https://doi.org/10.22230/cjc.2005v30n4a1525
De Jans, S., Cauberghe, V., & Hudders, L. (2018). How an Advertising Disclosure Alerts Young Adolescents to Sponsored Vlogs: The Moderating Role of a Peer-Based Advertising Literacy Intervention through an Informational Vlog. Journal of Advertising, 47(4), 309–325. https://doi.org/10.1080/00913367.2018.1539363
De Pauw, P., Hudders, L., & Cauberghe, V. (2018). Disclosing brand placement to young children. International Journal of Advertising, 37(4), 508–525. https://doi.org/10.1080/02650487.2017.1335040
Diakopoulos, N. (2013, October 3). Rage Against the Algorithms. The Atlantic. https://www.theatlantic.com/technology/archive/2013/10/rage-against-the-algorithms/280255/
Dixon, R., & Nussbaum, M. (2012). Children’s Rights and a Capabilities Approach: The Question of Special Priority. Cornell Law Review, 97(3), 549.
Dworkin, G. (1988). The Theory and Practice of Autonomy. Cambridge University Press. https://doi.org/10.1017/CBO9780511625206
Feinberg, J. (1980). Whose Child?: Children’s Rights, Parental Authority, and State Power. Littlefield Adams.
Forst, R. (2015). Noumenal Power. Journal of Political Philosophy, 23(2), 111–127. https://doi.org/10.1111/jopp.12046
Gaspar, U. (2018). Children at Play: Thoughts about the impact of networked toys in the game of life and the role of law. The International Review of Information Ethics, 27. https://doi.org/10.29173/irie100
Gibbs, S. (2015, March 13). Privacy fears over ‘smart’ Barbie that can listen to your kids. The Guardian. https://www.theguardian.com/technology/2015/mar/13/smart-barbie-that-can-listen-to-your-kids-privacy-fears-mattel
Giesinger, J. (2019). Vulnerability and Autonomy – Children and Adults. Ethics and Social Welfare, 13(3), 216–229. https://doi.org/10.1080/17496535.2019.1647262
Goodyear, V. A., Armour, K. M., & Wood, H. (2019). Young people learning about health: The role of apps and wearable devices. Learning, Media and Technology, 44(2), 193–210. https://doi.org/10.1080/17439884.2019.1539011
Helmond, A. (n.d.). Essay on Identity 2.0: Constructing identity with cultural software. Retrieved 22 March 2022, from https://www.annehelmond.nl/2010/01/21/essay-on-identity-2-0-constructing-identity-with-cultural-software/
Hern, A. (2017, October 6). ‘Kids should not be guinea pigs’: Mattel pulls AI babysitter. The Guardian. https://www.theguardian.com/technology/2017/oct/06/mattel-aristotle-ai-babysitter-children-campaign
Hoffman, M. L. (1982). Affect and moral development. New Directions for Child and Adolescent Development, 1982(16), 83–103. https://doi.org/10.1002/cd.23219821605
Keymolen, E., & Van der Hof, S. (2019). Can I still trust you, my dear doll? A philosophical and legal exploration of smart toys and trust. Journal of Cyber Policy, 4(2), 143–159. https://doi.org/10.1080/23738871.2019.1586970
King, D. L., Delfabbro, P. H., Gainsbury, S. M., Dreier, M., Greer, N., & Billieux, J. (2019). Unfair play? Video games as exploitative monetized services: An examination of game patents from a consumer protection perspective. Computers in Human Behavior, 101, 131–143. https://doi.org/10.1016/j.chb.2019.07.017
Lobosco, K. (2015, March 11). Talking Barbie is too creepy for some parents. CNNMoney. https://money.cnn.com/2015/03/11/news/companies/creepy-hello-barbie/index.html
Macenaite, M. (2017). From universal towards child-specific protection of the right to privacy online: Dilemmas in the EU General Data Protection Regulation. New Media & Society, 19(5), 765–779. https://doi.org/10.1177/1461444816686327
Martínez, C., & Olsson, T. (2019). Making sense of YouTubers: How Swedish children construct and negotiate the YouTuber Misslisibell as a girl celebrity. Journal of Children and Media, 13(1), 36–52. https://doi.org/10.1080/17482798.2018.1517656
Matthews, G. B. (2009, October 30). Philosophy and Developmental Psychology. The Oxford Handbook of Philosophy of Education. https://doi.org/10.1093/oxfordhb/9780195312881.003.0010
Mayer-Schönberger, V. (2011). Delete: The Virtue of Forgetting in the Digital Age (Revised edition). Princeton University Press.
Mills, S. (2019). Into Hyperspace: An Analysis of Hypernudges and Personalised Behavioural Science (SSRN Scholarly Paper ID 3420211). Social Science Research Network. https://doi.org/10.2139/ssrn.3420211
Mullin, A. (2007). Children, Autonomy, and Care. Journal of Social Philosophy, 38, 536–553. https://doi.org/10.1111/j.1467-9833.2007.00397.x
Nissenbaum, H. (2009). Privacy in Context: Technology, Policy, and the Integrity of Social Life (1st edition). Stanford Law Books.
Nussbaum, M. C. (2009). Creating Capabilities: The Human Development Approach and Its Implementation. Hypatia, 24(3), 211–215.
Ofcom. (2021, July 13). Children and parents: Media use and attitudes report 2020/21. Ofcom. https://www.ofcom.org.uk/research-and-data/media-literacy-research/childrens/children-and-parents-media-use-and-attitudes-report-2021
Papacharissi, Z. A. (2010). A Private Sphere: Democracy in a Digital Age (1st edition). Polity.
Papadamou, K., Papasavva, A., Zannettou, S., Blackburn, J., Kourtellis, N., Leontiadis, I., Stringhini, G., & Sirivianos, M. (2019). Disturbed YouTube for Kids: Characterizing and Detecting Inappropriate Videos Targeting Young Children. arXiv:1901.07046 [cs]. http://arxiv.org/abs/1901.07046
Piaget, J., & Inhelder, B. (1974). The Child’s Construction of Quantities: Conservation and Atomism. Psychology Press.
Renda, A. (2020, July 9). Europe: Toward a Policy Framework for Trustworthy AI. The Oxford Handbook of Ethics of AI. https://doi.org/10.1093/oxfordhb/9780190067397.013.41
Rousseau, J.-J. (1979). Emile: Or On Education (A. Bloom, Trans.). Basic Books.
Rughiniş, C., Rughiniş, R., & Matei, Ş. (2015). A touching app voice thinking about ethics of persuasive technology through an analysis of mobile smoking-cessation apps. Ethics and Information Technology, 17(4), 295–309. https://doi.org/10.1007/s10676-016-9385-1
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55(1), 68–78. https://doi.org/10.1037/0003-066X.55.1.68
Ryan, R. M., Williams, G. C., Patrick, H., & Deci, E. L. (2009). Self-determination theory and physical activity: The dynamics of motivation in development and wellness. Hellenic Journal of Psychology, 6(2), 107–124. Scopus.
Sætra, H. S. (2019). When nudge comes to shove: Liberty and nudging in the era of big data. Technology in Society, 59, 101130. https://doi.org/10.1016/j.techsoc.2019.04.006
Schneier, B. (n.d.). Data is a toxic asset. Schneier on Security. Retrieved 22 March 2022, from https://www.schneier.com/blog/archives/2016/03/data_is_a_toxic.html
Schüll, N. D. (2014). Addiction by Design: Machine Gambling in Las Vegas (New in Paper edition). Princeton University Press.
Siegler, R. S., Saffran, J., Eisenberg, N., DeLoache, J. S., Gershoff, E., & Leaper, C. (2017). How Children Develop (Fifth edition). Worth Publishers.
Smith, J., & de Villiers-Botha, T. (2021). Hey, Google, leave those kids alone: Against hypernudging children in the age of big data. AI & SOCIETY. https://doi.org/10.1007/s00146-021-01314-w
Steeves, V. (2020). A dialogic analysis of Hello Barbie’s conversations with children. Big Data & Society, 7(1), 2053951720919151. https://doi.org/10.1177/2053951720919151
Sweeney, L., Abu, A., & Winn, J. (2013). Identifying Participants in the Personal Genome Project by Name (SSRN Scholarly Paper ID 2257732). Social Science Research Network. https://doi.org/10.2139/ssrn.2257732
Sziron, M., & Hildt, E. (2018). Digital Media, the Right to an Open Future, and Children 0–5. Frontiers in Psychology, 9. https://www.frontiersin.org/article/10.3389/fpsyg.2018.02137
Thaler, R. H., & Sunstein, C. R. (2009). Nudge: Improving Decisions About Health, Wealth, and Happiness (Revised & Expanded edition). Penguin Books.
Tissenbaum, M., Sheldon, J., & Abelson, H. (2019). From computational thinking to computational action. Communications of the ACM, 62(3), 34–36. https://doi.org/10.1145/3265747
UN Global Pulse. (2013, June 1). Big Data for Development: Primer. UN Global Pulse. https://www.unglobalpulse.org/document/big-data-for-development-primer/
Vanwesenbeeck, I., Walrave, M., & Ponnet, K. (2017). Children and advergames: The role of product involvement, prior brand attitude, persuasion knowledge and game attitude in purchase intentions and changing attitudes. International Journal of Advertising, 36(4), 520–541. https://doi.org/10.1080/02650487.2016.1176637
Véliz, C. (2021). Privacy is Power: Why and How You Should Take Back Control of Your Data. Bantam Press.
Wendler, D. (1996). Deception in medical and behavioral research: Is it ever acceptable? The Milbank Quarterly, 74(1), 87–114.
Yeung, K. (2017). ‘Hypernudge’: Big Data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136. https://doi.org/10.1080/1369118X.2016.1186713
Zuboff, S. (2015). Big other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75–89. https://doi.org/10.1057/jit.2015.5