Publisher: The MIT Press
Published: 01 March 2022
DOI: 10.7551/mitpress/12748.001.0001
EISBN: 9780262368810
How East German artists made their country's experimental art scene a form of (counter)public life.

Experimental artists in the final years of the German Democratic Republic did not practice their art in the shadows, on the margins, hiding away from the Stasi's prying eyes. In fact, as Sara Blaylock shows, many cultivated a critical influence over the very bureaucracies meant to keep them in line, undermining state authority through forthright rather than covert projects. In Parallel Public, Blaylock describes how some East German artists made their country's experimental art scene a form of (counter)public life, creating an alternative to the crumbling collective underpinnings of the state.

Blaylock examines the work of artists who used body-based practices—including performance, film, and photography—to create new vocabularies of representation, sharing their projects through independent networks of dissemination and display. From the collective films and fashion shows of Erfurt's Women Artists Group, which fused art with feminist political action, to Gino Hahnemann, the queer filmmaker and poet who set nudes alight in city parks, these creators were as bold in their ventures as they were indifferent to state power.

Parallel Public is the first work of its kind on experimental art in East Germany to be written in English. Blaylock draws on extensive interviews with artists, art historians, and organizers; artist-made publications; official reports from the Union of Fine Artists; and Stasi surveillance records. As she recounts the role culture played in the GDR's rapid decline, she reveals East German artists as dissenters and witnesses, citizens and agents, their work both antidote to and diagnosis of a weakening state.
Chapter entries: DOI 10.7551/mitpress/12748.003.0001 through 10.7551/mitpress/12748.003.0014 (The MIT Press, 01 March 2022, EISBN 9780262368810).
Publisher: The MIT Press
Published: 02 November 2021
DOI: 10.7551/mitpress/14050.001.0001
EISBN: 9780262367264
How big data and machine learning encode discrimination and create agitated clusters of comforting rage.

In Discriminating Data, Wendy Hui Kyong Chun reveals how polarization is a goal—not an error—within big data and machine learning. These methods, she argues, encode segregation, eugenics, and identity politics through their default assumptions and conditions. Correlation, which grounds big data's predictive potential, stems from twentieth-century eugenic attempts to "breed" a better future. Recommender systems foster angry clusters of sameness through homophily. Users are "trained" to become authentically predictable via a politics and technology of recognition. Machine learning and data analytics thus seek to disrupt the future by making disruption impossible.

Chun, who has a background in systems design engineering as well as media studies and cultural theory, explains that although machine learning algorithms may not officially include race as a category, they embed whiteness as a default. Facial recognition technology, for example, relies on the faces of Hollywood celebrities and university undergraduates—groups not famous for their diversity. Homophily emerged as a concept to describe white U.S. residents' attitudes toward living in biracial yet segregated public housing. Predictive policing technology deploys models trained on studies of predominantly underserved neighborhoods. Trained on selected and often discriminatory or dirty data, these algorithms are validated only if they mirror this data.

How can we release ourselves from the vice-like grip of discriminatory data? Chun calls for alternative algorithms, defaults, and interdisciplinary coalitions in order to desegregate networks and foster a more democratic big data.
Chapter entries: DOI 10.7551/mitpress/14050.003.0001 through 10.7551/mitpress/14050.003.0004 (The MIT Press, 02 November 2021, EISBN 9780262367264).