[hackerspaces] An interesting point of view : "On Feminism and Microcontrollers"

Michel Bauwens michelsub2004 at gmail.com
Sat Oct 2 12:01:33 CEST 2010

Some background on protocollar power and intentional design, taken from
various sources:

On Sat, Oct 2, 2010 at 11:04 AM, Alexandre Dulaunoy <a at foo.be> wrote:

> For sharing with you,
> Leah Buechley and Benjamin Mako Hill made an interesting
> comparative paper[1] about LilyPad and Arduino.
>   [1] http://hlt.media.mit.edu/publications/buechley_DIS_10.pdf

Design is power: A review of issues around the concept of protocollary power

Michel Bauwens
3rd October 2010

Protocollary Power is a concept developed by Alexander Galloway in his book
Protocol, to denote the new way power and control are exercised in
distributed networks.

(See also, in the P2P Foundation wiki, our entries on the Architecture of
Control and on Computing Regimes.)

Here is Alexander Galloway's description of the concept, from his book:

“Protocol is not a new word. Prior to its usage in computing, protocol
referred to any type of correct or proper behavior within a specific system
of conventions. It is an important concept in the area of social etiquette
as well as in the fields of diplomacy and international relations.
Etymologically it refers to a fly-leaf glued to the beginning of a document,
but in familiar usage the word came to mean any introductory paper
summarizing the key points of a diplomatic agreement or treaty.

However, with the advent of digital computing, the term has taken on a
slightly different meaning. Now, protocols refer specifically to standards
governing the implementation of specific technologies. Like their diplomatic
predecessors, computer protocols establish the essential points necessary to
enact an agreed-upon standard of action. Like their diplomatic predecessors,
computer protocols are vetted out between negotiating parties and then
materialized in the real world by large populations of participants (in one
case citizens, and in the other computer users). Yet instead of governing
social or political practices as did their diplomatic predecessors, computer
protocols govern how specific technologies are agreed to, adopted,
implemented, and ultimately used by people around the world. What was once a
question of consideration and sense is now a question of logic and physics.

To help understand the concept of computer protocols, consider the analogy
of the highway system. Many different combinations of roads are available to
a person driving from point A to point B. However, en route one is compelled
to stop at red lights, stay between the white lines, follow a reasonably
direct path, and so on. These conventional rules that govern the set of
possible behavior patterns within a heterogeneous system are what computer
scientists call protocol. Thus, protocol is a technique for achieving
voluntary regulation within a contingent environment.

These regulations always operate at the level of coding–they encode packets
of information so they may be transported; they code documents so they may
be effectively parsed; they code communication so local devices may
effectively communicate with foreign devices. Protocols are highly formal;
that is, they encapsulate information inside a technically defined wrapper,
while remaining relatively indifferent to the content of information
contained within. Viewed as a whole, protocol is a distributed management
system that allows control to exist within a heterogeneous material milieu.

It is common for contemporary critics to describe the Internet as an
unpredictable mass of data–rhizomatic and lacking central organization. This
position states that since new communication technologies are based on the
elimination of centralized command and hierarchical control, it follows that
the world is witnessing a general disappearance of control as such.

This could not be further from the truth. I argue in this book that protocol
is how technological control exists after decentralization. The “after” in
my title refers to both the historical moment after decentralization has
come into existence, but also–and more important–the historical phase after
decentralization, that is, after it is dead and gone, replaced as the
supreme social management style by the diagram of distribution.”
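Galloway's description of protocol as a formal wrapper that is "relatively indifferent to the content of information contained within" can be made concrete with a toy framing protocol. This is an illustrative sketch, not any real standard: a four-byte length header encapsulates an arbitrary payload, and the decoder parses only the header, never inspecting the content.

```python
import struct

def encode(payload: bytes) -> bytes:
    # Wrap the payload in a fixed header: a 4-byte big-endian length.
    # The wrapper regulates transport while staying indifferent to content.
    return struct.pack(">I", len(payload)) + payload

def decode(frame: bytes) -> bytes:
    # Parse the header, then return the payload unchanged.
    (length,) = struct.unpack(">I", frame[:4])
    payload = frame[4:4 + length]
    if len(payload) != length:
        raise ValueError("truncated frame")
    return payload

msg = "any content at all".encode()
assert decode(encode(msg)) == msg
```

The control is in the form, not the content: any payload passes, but only if it submits to the agreed-upon header format.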

The following citations confirm the role of Design, and the intention behind
it, as a function of Protocollary Power:

Mitch Ratcliffe:

“Yes, networks are grown. But the medium they grow in, in this case the
software that supports them, is not grown but designed & architected. The
social network ecosystem of the blogosphere was grown, but the blog software
that enabled it was designed. Wikis are a socially grown structure on top of
software that was designed. It’s fortuitous that the social network
structures that grew on those software substrates turn out to have
interesting & useful properties.

With a greater understanding of which software structures lead to which
social network topologies & what the implications are for the robustness,
innovativeness, error correctiveness, fairness, etc. of those various
topologies, software can be designed that will intentionally & inevitably
lead to the growth of political social networks that are more robust,
innovative, fair & error correcting.”

Mitch Kapor on ‘Politics is Architecture’:

“‘Politics is architecture’: the architecture (structure and design) of
political processes, not their content, is determinative of what can be
accomplished. Just as you can’t build a skyscraper out of bamboo, you can’t
have a participatory democracy if power is centralized, processes are
opaque, and accountability is limited.”

Fred Stutzman on Pseudo-Governmental Decisions in Social Software:

“When one designs social software, they are forced to make
pseudo-governmental decisions about how the contained ecosystem will behave.
Examples of these decisions include limits on friending behavior, limits on
how information in a profile can be displayed, and how access to information
is restricted in the ecosystem. These rules create and inform the structural
aspects of the ecosystem, causing participants in the ecosystem to behave a
specific way.

As we use social software more, and social software more neatly integrates
with our lives, a greater portion of our social rules will come to be
enforced by the will of software designers. Of course, this isn’t new – when
we elected to use email, we agreed to buy into the social consequences of
email. Perhaps because we are so used to making tradeoffs when we adopt
social technology, we don’t notice them anymore. However, as social
technology adopts a greater role in mediating our social experience, it will
become very important to take a critical perspective in analyzing how the
will of designers changes us.”
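Stutzman's "limits on friending behavior" are, concretely, a few lines of code that every participant then lives under. A hypothetical sketch, where the class, names, and the 5000 cap are illustrative inventions rather than any specific platform's rules:

```python
class Profile:
    # A design decision, experienced by users as a social rule.
    MAX_FRIENDS = 5000

    def __init__(self, name: str):
        self.name = name
        self.friends: set[str] = set()

    def add_friend(self, other: str) -> bool:
        # The platform, not the participants, decides when friending stops.
        if len(self.friends) >= self.MAX_FRIENDS:
            return False
        self.friends.add(other)
        return True

alice = Profile("alice")
assert alice.add_friend("bob") is True
```

The point is not the particular limit but that the rule lives in the designer's code rather than in social negotiation between users.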

Here’s an example of the implementation of Social Values in Technical Code:

“In a paper about the hacker community, Hannemyr compares and contrasts
software produced in both open source and commercial realms in an effort to
deconstruct and problematize design decisions and goals. His analysis
provides us with further evidence regarding the links between social values
and software code. He concludes:

“Software constructed by hackers seem to favor such properties as
flexibility, tailorability, modularity and openendedness to facilitate
on-going experimentation. Software originating in the mainstream is
characterized by the promise of control, completeness and immutability”
(Hannemyr, 1999).

To bolster his argument, Hannemyr outlines the striking differences between
document mark-up languages (like HTML and Adobe PDF), as well as various
word processing applications (such as TeX and Emacs versus Microsoft Word)
that have originated in open and closed development environments. He
concludes that “the difference between the hacker’s approach and those of
the industrial programmer is one of outlook: between an agoric, integrated
and holistic attitude towards the creation of artifacts and a proprietary,
fragmented and reductionist one” (Hannemyr, 1999). As Hannemyr’s analysis
reveals, the characteristics of a given piece of software frequently reflect
the attitude and outlook of the programmers and organizations from which it
originates.

Armin Medosch shows how corporate-owned social media platforms re-introduce
centralization through the back door:

“In media theory much has been made of the one-sided and centralised
broadcast structure of television and radio. The topology of the broadcast
system (centralised, one-to-many, one-way) has been compared unfavourably to
the net, which is a many-to-many structure, but also one-to-many and
many-to-one; in terms of topology it is a highly distributed or mesh
network. So the net has been hailed as finally making good on the promise of
participatory media usage. What so-called social media do is re-introduce a
centralised structure through the back door. While the communication of the
users is ‘participatory’ and many-to-many, and so on and so forth, it is
organised via a centralised platform, venture-capital funded, corporately
owned. Thus, while social media bear the promise of making good on the
emancipatory power of networked communication, in fact they re-introduce the
producer-consumer divide on another layer, that of host/user. They perform a
false Aufhebung of the broadcast paradigm. Therefore I think the term
prosumer is misleading and not very useful. While the users do produce
something, there is nothing ‘pro’ as in professional in it.

This leads to a second point. The conflict between labour and capital has
played itself out via mechanization and rationalization, scientific
management and its refinement, such as the scientific management of office
work, the proletarisation of what is wrongly called ‘white collar work’, the
replacement of human labour by machines in both the factory and the office,
etc. What this entailed was an extraction of knowledge from the skilled
artisan, the craftsman, the high level clerk, the analyst, etc., and its
formalisation into an automated process, whereby this abstraction decidedly
shifts the balance of power towards management. Now what happened with the
transition from Web 1.0 to 2.0 is a very similar process. Remember the
static homepage in html? You needed to be able to code a bit, actually for
many non-geeks it was probably the first satisfactory coding experience
ever. You needed to set the links yourself and check the backlinks. Now a
lot of that is being done by automated systems. The linking knowledge of
freely acting networked subjects has been turned into a system that suggests
whom you link with and that establishes many relationships involuntarily. It
is usually more work to get rid of this than to have it done for you.
Therefore Web 2.0 in many ways is actually a dumbing down of people, a
deskilling similar to what has happened in industry over the past 200 years.

I wanted to stay short and precise, but need to add that social media is a
misnomer. What social media would be are systems that are collectively owned
and maintained by their users, that are built and developed according to
their needs and not according to the needs of advertisers and sinister
powers who are syphoning off the knowledge generated about social
relationships in secret data mining and social network analysis processes.

So there is a solution, one which I continue to advocate: let’s get back to
creating our own systems, let’s use free and open source software for server
infrastructures and let’s socialise via a decentralised landscape of smaller
and bigger hubs that are independently organised, rather than feeding the
machine …” (IDC mailing list, Oct 31, 2009)

Harry Halpin insists that Protocols are Designed by People:

“Galloway is correct to point out that there is control in the internet, but
instead of reifying the protocol or even network form itself, an ontological
mistake that would be like blaming capitalism on the factory, it would be
more suitable to realise that protocols embody social relationships. Just as
genuine humans control factories, genuine humans – with names and addresses
– create protocols. These humans can and do embody social relations that in
turn can be considered abstractions, including those determined by the
abstraction that is capital. But studying protocol as if it were first and
foremost an abstraction without studying the historic and dialectic movement
of the social forms which give rise to the protocols neglects Marx’s insight:

“Technologies are organs of the human brain, created by the human hand; the
power of knowledge, objectified.”

Bearing protocols’ human origination in mind, there is no reason why they
must be reified into a form of abstract control when they can also be
considered the solution to a set of problems faced by individuals within
particular historical circumstances. If they now operate as abstract forms
of control, there is no reason why protocols could not also be abstract
forms of collectivity. Instead of hoping for an exodus from protocols by
virtue of art, perhaps one could inspect the motivations, finances, and
structure of the human agents that create them in order to gain a more
strategic vantage point. Some of these are hackers, while others are
government bureaucrats or representatives of corporations – although it
would seem that hackers usually create the protocols that actually work and
gain widespread success. To the extent that those protocols are accepted,
this class that I dub the ‘immaterial aristocracy’ governs the net. It
behoves us to inspect the concept of digital sovereignty in order to
discover which precise body or bodies have control over it.”