algorithm | big data | black boxes | citizenship | COPPA
critical pedagogy | data mining | digital footprint | digital redlining
FERPA | filter bubble | online identity | personalization | quantified self
At the simplest level, algorithms are mathematical formulas: they accept inputs from pre-determined domains and produce outputs within specified ranges. Thus, they seem to offer an analogy to human analysis and cognition. Generally, the notion suggests that from a relatively small set of rules, large-scale insights can be produced. More complex accounts of algorithms treat the concept as part of the culture's ongoing anxiety about building complete and consistent representational systems.
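The simple sense of the term can be illustrated with a toy example (invented for illustration, not drawn from the source): a small set of rules maps inputs from a pre-determined domain to outputs in a fixed range.

```python
def letter_grade(score: int) -> str:
    """Map a numeric score in the domain 0-100 to a letter grade
    using a small, fixed set of rules."""
    if not 0 <= score <= 100:
        raise ValueError("score must be in the domain 0-100")
    # The "rules": ordered cutoffs determine the output.
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

print(letter_grade(85))  # prints "B"
```

The rules are visible here; in the "black box" sense discussed below, only the input and output would be.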
This non-technical term recognizes that the ability to collect immense numbers of data points has become a defining feature of our digital culture. The term is shorthand for the databases, analytic tools, and surveillance tools that now dominate that culture.
Black boxes are defined by their inputs and outputs. The processes that transform input into output are invisible, and this opaqueness gives rise to the term. Because the internal workings are invisible, black boxes direct the user to questions of function and utility rather than to questions about the values, beliefs, tactics, and strategies that have shaped the "rules" within the box. Thus, the term suggests indifference to questions of ideology, power, authority, and politics. If the processes of a "box" can be examined, the system is called a "white box."
Currently, the notion of citizenship is contested. Those on the right see citizenship as a set of economic rights which dictate legal, political, and social structures. Self-interest is seen as the regulating force for social stability. Those on the left tend to see citizenship as a feature of the fundamentally social nature of humans. As a creation of social structures, the citizen has a dual responsibility: first, to participate in collective activity; second, to continually critique public policies, laws, social structures, and economic arrangements for the collective good.
The Children's Online Privacy Protection Act of 1998 (COPPA) regulates the collection of information from children by websites. The FTC's summary of the law is clear:
"As marketers know, the Rule puts certain requirements in place if you operate a website or online service directed to children under 13 or if you have actual knowledge that you’re collecting personal information online from kids in that age group. Even with today’s announcement, most big-picture COPPA principles remain unchanged. You still have to give notice to parents and get their verifiable consent before collecting, using, or disclosing personal information from children under 13. You still have to keep kids’ information secure and you can’t condition their participation in activities on the collection of more personal info than is reasonably necessary to take part. And the new Rule retains 'safe harbor' provisions so that groups can submit programs for FTC approval."
Henry Giroux offers the following definition of Critical Pedagogy: "The critical question here is whose future, story, and interests does the school represent." Asking that question re-asserts the power of teachers, administrators, students, and the community in an environment increasingly controlled by corporate interests. Peter McLaren further notes that Critical Pedagogy is an approach adopted by progressive teachers, administrators, and communities attempting to eliminate inequalities on the basis of social class, race, and sex.
Critical educators ask about whose knowledge has the most value in our culture, and what beliefs, ideologies, and politics have determined that value. Critical educators ask how the structure of curricula and teaching practices contribute to and/or resist social stratification. Throughout, critical pedagogy foregrounds the relationship of knowledge and power and asks how to make the critique of that relationship central to education. The goal of critical pedagogy is to enable the critical thinking that underlies democracy.
Digital tools produce immense numbers of data points, but this mountain of information acquires meaning and value only when it is queried, organized, and used for specific purposes. The transformation of digitized information into knowledge is termed "data mining."
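The transformation can be sketched with a minimal, invented example (the data and names are hypothetical): raw data points become knowledge only when they are organized and queried for a purpose, here finding the most-viewed page in a click log.

```python
from collections import Counter

# Raw data points: an unorganized stream of page views.
clicks = ["home", "search", "profile", "search", "search", "home"]

# Organizing and querying the data turns it into an insight.
by_page = Counter(clicks)
most_viewed, count = by_page.most_common(1)[0]

print(most_viewed, count)  # prints "search 3"
```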
The electronic record left by the use of digital tools such as search engines, databases, social media, cell phones, credit cards, and the myriad of electronic devices that shape our daily lives is called a "digital footprint." Like a footprint along a beach, it is the history of a journey, but in this case a journey through the information we choose to engage.
"Redlining" originally referred to charging more for, or refusing to provide, financial services, insurance, and the like in particular neighborhoods. The term comes from the banking practice of drawing red lines around areas where specific ethnic or racial groups lived. It has since been generalized to include any form of discrimination based on profiles that indirectly identify people on the basis of ethnicity, gender, race, and other such categories. Digital redlining refers to digital profiling that produces discriminatory practices. Data mining can produce such discrimination even when it is unintended.
The Family Educational Rights and Privacy Act (FERPA) is a federal law granting parents rights over their children's academic records. Parents have the right to access records, to ask that records be amended, and to control disclosure of their contents. Once a student turns 18, a school must have the student's approval to release information. When an educational institution transfers information to a third party, these rights change, and this change is exploited by a variety of educational corporations.
Eli Pariser's term describes how the history of our digital searches creates "personalized" results that reflect the interests, values, and beliefs implicit in those past searches. Because the search algorithm provides only information that agrees with the user's viewpoint, the search engine becomes an invisible agent of confirmation bias: users do not encounter complicating or contradictory viewpoints. Pariser sees the filter bubble as a source of political and intellectual intolerance.
We play different roles in different situations, and for each we adopt a different persona. Digital environments are no different from any other context except that they are relatively new. Thus, on the internet, a user establishes yet another identity. Those who see the individual as possessing a single genuine, authentic identity find this problematic. Those who see identity as contextual see it as one more iteration of a familiar social process.
Personalization is the result of algorithms that provide search results that reinforce the values, beliefs, and interests expressed in previous searches. Personalization creates a filter bubble that excludes contradictory ideas and viewpoints.
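The mechanism can be sketched as a toy ranking function (all names and data invented; real search engines are vastly more complex): results that overlap with terms from past searches are promoted, so prior interests are reinforced and unrelated material is pushed down.

```python
def personalize(results, history):
    """Rank candidate results by word overlap with a user's past searches."""
    past_terms = {word for query in history for word in query.split()}

    def overlap(result):
        return len(past_terms & set(result.split()))

    # Results sharing more terms with past searches rank higher.
    return sorted(results, key=overlap, reverse=True)

history = ["climate policy", "climate science news"]
results = ["local sports scores", "climate policy debate", "new climate science study"]
print(personalize(results, history))  # the unrelated result sinks to last place
```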
The quantified self is the result of the ongoing collection of data about a person's daily life. An immense set of data points about health, activity, transportation, consumption, environmental factors, and mood can be sensed and monitored digitally. Wearable sensors enable this quantified self to generate inputs whose analysis provides advice, insight, and protection for the physical self. While seemingly beneficial, quantifying the self comes at a cost in privacy and control.
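A minimal sketch of this analysis step, using invented sensor readings: raw wearable data points are reduced to a simple self-quantifying insight.

```python
from statistics import mean

# Hypothetical heart-rate samples from a wearable sensor (beats per minute).
heart_rate_bpm = [62, 64, 71, 118, 95, 66]

# The "quantified self": raw readings summarized into advice-ready numbers.
print(f"avg={mean(heart_rate_bpm):.1f} peak={max(heart_rate_bpm)}")
```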