Design Educators Community

CDP: Creative, Live, and Co-Coding

This conversation was held on 21 December 2021. It has been edited for clarity, and additional commentary has been inserted for emphasis.

Ted Davis is a lecturer in interaction design at the undergraduate and graduate levels and coordinator of the UIC/HGK International Master of Design program within the Institute for Digital Communication Environments (IDCE, formerly Visual Communication Institute) at The Basel School of Design FHNW HGK in Switzerland. Ted has authored P5LIVE, a web-based collaborative live coding environment for p5.js, and co-authored basil.js, a JavaScript-based scripting plugin for Adobe InDesign. His practice and instruction explore the volatility of digital media through Glitch and the reactivation of older ‘new media’ using newer programming means.


Kyuha Shim (QS)
Creative coding enables us to design in a generative way, in which form is continuously constructed and reconstructed upon the interaction between input parameters and algorithms. What is it like using code in a designerly way? Can you tell us about what you teach at The Basel School of Design?

Ted Davis (TD)
I teach code in the context of interaction design from the first-year undergraduate to the master’s level, introducing HTML, CSS, JavaScript (including basil.js and p5.js), and Processing. With the first-year students, for the last couple of years, I’ve been assigning re-coding projects. They choose a historical computational artwork (most often from the compArt database) and try to recreate it with code as closely as possible, figuring out how the math and logic work. Along the way they learn the syntax and concepts of code such as logic, loops, functions, and OOP (object-oriented programming). For the brave, they then add audio reactivity or graphical user interface elements to customize their project and take advantage of the fact that it’s real-time, compared to the limitations of the plotter or similar device on which the original was first created. This past semester, we’ve been exploring recently digitized films from the Basel School of Design’s own Film + Design course archive. This brings the added challenge of trying to recreate the design of the motion too.
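As an illustration of the kind of re-coding exercise described here, a minimal p5.js sketch in that spirit might look like the following; the grid, the growing disorder, and all of the values are invented for demonstration rather than taken from any particular artwork or student project.

function setup() {
  createCanvas(400, 600);
  noFill();
  stroke(0);
  noLoop(); // render one composition; press any key for a new variation
}

function draw() {
  background(255);
  const size = 28; // square size in pixels
  const cols = 12;
  const rows = 20;
  for (let row = 0; row < rows; row++) {
    for (let col = 0; col < cols; col++) {
      const disorder = row / rows; // disorder grows toward the bottom
      const angle = random(-disorder, disorder) * HALF_PI;
      const dx = random(-disorder, disorder) * size * 0.5;
      const dy = random(-disorder, disorder) * size * 0.5;
      push();
      translate(32 + col * size + dx, 32 + row * size + dy);
      rotate(angle);
      rect(-size / 2, -size / 2, size, size);
      pop();
    }
  }
}

function keyPressed() {
  redraw(); // regenerate with new random values
}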


Reactive Recode. © 2021, Basil Mayer + Vivien Pöhls.


QS
Learning the syntax and concepts of code by way of re-coding sounds like a fun and stress-free way to learn computation. Challenging students to deconstruct visual structures and reconstruct them through programming would help them acquire what I call a computational “lens”: a way of seeing form as visual patterns that can be built upon logic. This process sounds similar to reverse engineering. I imagine that the intent is not to simply replicate the original work, but to study the visual principles of form. Essentially, students are learning how to view and think about form and formation from systemic perspectives. Once they are exposed to computational design foundations in this way, what projects would your students work on next? Or, in which design contexts would they use code?

TD
It varies. It’s always a question at the beginning, especially with first-year students: what could they do with it? It’s tricky to answer because they could do anything once they’ve learned code! This year, with the master’s students, we are running for the second time what I call p5-tools, which is about building mini tools with p5.js. The term “tool” is left very open to interpretation, but generally includes a graphical user interface (GUI). This edition (p5-t00ls) includes projects such as poster generators, VJ tools, and 3D visualizations that display posters in a virtual space. Some students are making a virtual grocery checkout that acts as a rhythm drum machine; others a rainbow-generating tool; yet another a salad-generating tool for exploring recipes. Through the project, the students learn to build their own tools with code. This starts by exploiting a discovery in code or defining what you want to do, breaking it down into sequential steps, starting simple and then letting the complexity seep in. Maybe the tool is only useful for oneself at the beginning, but later it could be useful for more people.
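To make the idea concrete, a hypothetical mini tool could be as small as the sketch below: a poster generator whose GUI consists of two p5.js sliders, with the word, format, and every value invented purely for illustration.

// a hypothetical mini "poster generator": two sliders act as the GUI
let countSlider, sizeSlider;

function setup() {
  createCanvas(420, 594); // roughly A-series poster proportions
  countSlider = createSlider(1, 40, 12); // how many repetitions of the word
  sizeSlider = createSlider(8, 120, 48); // type size
  countSlider.input(() => redraw());     // re-render whenever a slider moves
  sizeSlider.input(() => redraw());
  noLoop(); // draw only on demand
}

function draw() {
  background(240);
  fill(20);
  noStroke();
  textSize(sizeSlider.value());
  for (let i = 0; i < countSlider.value(); i++) {
    // scatter the word across the canvas for a new composition each render
    text('p5-t00ls', random(width - 150), random(40, height));
  }
}

function keyPressed() {
  redraw(); // press any key for another variation with the same settings
}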


Type Motion 3d. © 2019, Zian Lu + Alexandre Nielbo.


Beyond teaching code, I also teach about the volatility of digital media. Every spring semester, I teach a course on Glitch, which was the main topic of my thesis work. Glitch offers fascinating ways to learn about how images and media function. Students open a digital image, find the header and the body of the image, and make changes with tools such as hex editors or code to manipulate it. It introduces what is possible in digital image-making. I walk them through a process of what I like to call ‘precise mishandling’, because you know exactly what you’re doing wrong and what you shouldn’t do, but do it very properly. It’s not reverse engineering but more like poking at the image, making it break just enough that it’s not totally broken. It enables the students to learn about a given media file format and what it’s capable of through those breaks and errors. The project also gives students an opportunity to explore the hundreds of different image file formats designed for specialized purposes, as they have mainly dealt with the top five or so that everyone uses on the web.
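The underlying idea of ‘precise mishandling’ can be sketched in a few lines of browser JavaScript. This is an illustration only, not one of Ted’s tools: the header of a JPEG is left alone (crudely approximated here by skipping a fixed number of bytes rather than locating the real header boundaries with a hex editor), a handful of later bytes are overwritten, and the browser is asked to decode the damaged file.

// corrupt a few bytes of a JPEG past its (roughly guessed) header
async function glitchJpeg(file, corruptions = 10) {
  const bytes = new Uint8Array(await file.arrayBuffer());
  const headerEnd = 500; // crude guess; real headers vary in length
  for (let i = 0; i < corruptions; i++) {
    const pos = headerEnd + Math.floor(Math.random() * (bytes.length - headerEnd));
    bytes[pos] = Math.floor(Math.random() * 256); // overwrite one byte of image data
  }
  const img = document.createElement('img');
  img.src = URL.createObjectURL(new Blob([bytes], { type: 'image/jpeg' }));
  document.body.appendChild(img); // some corruptions break decoding entirely; just try again
}

// usage: wire it to a file input somewhere on the page
document.querySelector('input[type=file]')
  .addEventListener('change', (e) => glitchJpeg(e.target.files[0]));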


109,026 Vandals in Helvetica. © 2017, Mijeong Jeong.


Vexillologlitch. © 2020, Aubrey Pohl.


Pixel Storms. © 2019, Lena Frei.


QS
Your Glitch assignment sounds fascinating because your students start with an image, so they are aware that they are manipulating an artifact using code from the beginning. I think such an exercise would, again, help the students perceive visuals as the rendition of logic generated by a computer. I have seen your Glitch work online. I am curious whether you had any challenges teaching it or doing it on your own.

TD
Glitch has been one of the primary aspects of my practice for the last 10-plus years, and for teaching, I have made tools that can help my students Glitch. However, every couple of years those tools stop working when the students update their operating system (OS), especially once macOS switched to 64-bit-only software. As with the very first web-based glitching tools (TEXT2IMAGE, HEADerREMIX) I made during my thesis, I continue to appreciate how amazing the web browser is, as it doesn’t really matter which OS you are on. With HTML5 and the canvas element, it’s become powerful enough for real-time glitching of images, video, font files, anything.


p5.glitch romp using P5LIVE. © 2021, Ted Davis.



QS
It’s frustrating when code and tools are deprecated. I’m thinking about Scriptographer, which hasn’t been updated since Adobe CS6. I agree that the Web is a relatively more sustainable environment in that sense. Speaking of which, I am a huge fan of your P5LIVE, which provides a really fun coding environment for multiple people to collaborate live. Could you tell us more about it, and what motivated you to build it?

TD
P5LIVE started from the Processing Community Day 2019 that we held here in Basel together with a handful of my former students, who initiated it. We formed a group called ‘basel.codes’ and organized workshops and talks. There were a couple of DJs amongst our group, so we wanted to have a party at the end of the full day of coding. That sparked the idea that we needed live coding, which provides immediate feedback: typing something and seeing the visual output almost at the same time.

Previously, I had played with some live coding tools like Cyril, built upon openFrameworks. I had considered using that tool a few years earlier during a course on audio-visuality, taught together with Stefanie Bräuer, but again encountered OS issues as the tool hadn’t been updated for some time. Later I learned about REPL mode for the Processing IDE, a special mode for making updates without having to restart the sketch. But it’s still this sort of paradigm: you code, you hit play, and then you see your output. So you’d have to have two monitors, and you wouldn’t necessarily see the code. You would be able to have live visuals that you constantly change, but it would be kind of separated. I really wanted this overlapping algorave model, so that we could see the process of coding and see the visuals full screen. Hydra convinced me that the web browser could support fullscreen, live-coded visuals.

I started playing just in the browser with a textarea field, figuring out how to get it to compile into a full-screen sketch with p5.js, and then kept adding more features based on needs in our coding processes. P5LIVE provides instant feedback to the coding process. It’s as if the distance between putting your pencil on paper and seeing the mark became super close. And so it started out for performance, like a VJ tool made for p5.js. Soon after I thought: we’re in the browser, so why not collaborate, Google Docs or Etherpad style? Using WebSockets made it possible for multiple people to edit the same code and compile the visuals locally on their machines.
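The textarea-to-sketch idea at the heart of this description can be sketched in a few lines of JavaScript. This is an assumption-laden illustration, not P5LIVE’s actual code: it assumes p5.js is already loaded on the page and that a <textarea id="editor"> exists; the editor contents are recompiled into a fresh p5.js instance on every keystroke, and syntax errors simply leave the previous sketch running.

const editor = document.getElementById('editor');
let sketch = null;

function recompile() {
  let userCode;
  try {
    userCode = new Function('s', editor.value); // throws on syntax errors
  } catch (err) {
    return; // keep the previous sketch running until the code parses again
  }
  if (sketch) sketch.remove();         // tear down the old sketch and its canvas
  sketch = new p5((s) => userCode(s)); // run the new code in p5 instance mode
}

editor.addEventListener('input', recompile);
recompile();

// the textarea would hold p5 instance-mode code such as:
// s.setup = () => s.createCanvas(s.windowWidth, s.windowHeight);
// s.draw  = () => { s.background(0); s.circle(s.mouseX, s.mouseY, 50); };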



QS
Great points. Like many other designers, computational designers face technical roadblocks, but they find their way around or through them by trial and error. They are most likely the first test users of what they build, which in turn affects their own design processes. There is no end: we can continuously shape our tools and our processes.
I’ve always thought that one of the characteristics of computational processes was discreteness. Typically, in many scripting environments, we write our code and then run it to see the generated outcomes. P5LIVE has changed this, making the process of coding more participatory and real-time, and inviting more spontaneity. Have you introduced P5LIVE in your classrooms?

TD
Yes, this collaborative mode became COCODING, which enabled jamming with people around the world or in the classroom. I would show it to my students with the intent that they collaborate while at home, and would often find them using it throughout the class session while sitting physically right next to each other. This was a ‘nice to have’ feature until it became a critical necessity once the pandemic hit. Last year was entirely virtual, so I only got to know my first-year students remotely. Occasionally, I’d ask the 26 students to jump into a large COCODING session where I could demonstrate concepts and ask for participation along the way. To prevent chaos and bugs, early on I had introduced a feature called “lockdown,” which had a different meaning when I made it. When activated, only the admin can edit, while each participant can request and have their editing rights toggled. Throughout remote teaching, I still wanted to encourage partner projects, so there would be two brains working out code issues rather than one struggling alone. To support this, I pre-defined the COCODING session IDs and posted them on our course website. That way, we could meet as a class over video chat and, on demand, still offer individual help by just clicking on their COCODING session links.
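One way to picture the lockdown behaviour is as a small relay server; the hypothetical sketch below, in Node.js with the ws package, is not P5LIVE’s implementation, and the message shapes are invented. Code edits are only rebroadcast when they come from the admin or from a participant whose editing rights have been toggled on.

// hypothetical relay: broadcast code edits unless lockdown blocks the sender
const { WebSocketServer } = require('ws');
const wss = new WebSocketServer({ port: 8080 });

let admin = null;
let lockdown = false;

wss.on('connection', (ws) => {
  if (!admin) admin = ws;   // first to join acts as admin (an assumption)
  ws.canEdit = ws === admin;

  ws.on('message', (raw) => {
    const msg = JSON.parse(raw.toString()); // { type: 'code' | 'lockdown' | 'grant', ... }

    if (msg.type === 'lockdown' && ws === admin) {
      lockdown = msg.enabled; // admin toggles lockdown for the session
    } else if (msg.type === 'grant' && ws === admin) {
      const target = [...wss.clients][msg.clientIndex]; // admin toggles a peer's rights
      if (target) target.canEdit = msg.allowed;
    } else if (msg.type === 'code') {
      if (lockdown && !ws.canEdit) return; // drop edits from locked-out peers
      for (const client of wss.clients) {
        if (client !== ws) client.send(JSON.stringify(msg)); // peers compile it locally
      }
    }
  });
});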
Teaching code remotely was tricky, but I can only imagine how difficult it must have been to learn. From the teaching perspective, I received far fewer raised hands and requests for help than in the physical classroom. I attribute this to the juggling students had to do, watching a shared screen feed while following along in another window. Asking for help meant potentially stopping the teacher’s feed and having to ask in front of the whole group, compared to raising your hand or tapping a peer on the shoulder. In a group of 26 students, there are usually 5 or 6 asking for help when introducing topics like for loops; in the remote-teaching landscape, I was lucky to get a single question. This experience inspired applying for and receiving a university grant to extend COCODING into a split-screen view, so students could work alongside the teacher without the need for a second monitor. It would display the teacher’s code, live and interactive, on the left side, while each student or pair could have their own room on the right side. To ask for help, students could raise a virtual hand, allowing anyone to jump into that room and lend support.
This new project has become COCODING Classroom, which started development in spring 2021. At any time, the teacher and students can jump between rooms to follow each other’s progress. This small feature quickly became analogous to sitting beside a student to see what they’re working on. It is also available with a timer for automatic switching, which really simulates walking around the room. In the first alpha-test workshop, students quickly learned from each other, asking one another about unique visuals they saw in a friend’s room. Upon sharing the name of the function in question, they would both jump to that peer’s room and explore implementing it there. In that first workshop, it also became clear that each room needed its own chat. In the gallery mode, everyone moves together with the teacher to look at one sketch at a time. In this mode, any changes to the code are only local, so everyone is encouraged to remix all the values and learn how the code works. This view also has its own chat per sketch, to encourage written-out critiques from those who feel more comfortable chatting than speaking. We’ll have the second workshop at the beginning of February and plan to release it open source soon after (educators can sign up for early access). It’s being created specifically for remote teaching, but it will be great to see how it could be used in performance.


Remote-teaching course with my 26 BA students in both Webex and P5LIVE COCODING. © 2022, Ted Davis.


QS
It sounds like COCODING Classroom recreates much of the spontaneous, over-the-shoulder learning of a physical studio. Do you expect it to remain just as useful once students are back in the classroom, working side by side?

TD
I hope so! It was also fun during the alpha-testing workshop to see physically present students look at the projection screen, even though the code was present on the left side of their screen. It’s a mode of learning we’re so used to. In between technical inputs, the split-screen view can be pulled all the way to the left for focusing on their own space. If all goes well, they’ll occasionally have spontaneous visits from peers and share feedback with one another throughout the process.


COCODING Classroom workshop 01. © 2021, Ted Davis.


QS
I am curious whether there are other faculty members in your Institute for Digital Communication Environments at FHNW HGK who teach computational design, or design that uses code as a medium. If so, could you briefly tell us about your colleagues? How is your class connected with theirs?

TD
Teaching in our focus on interaction, which until recently was called ‘medium’, are Dirk Koy and Ludwig Zeller. Dirk’s work explores the boundaries of reality and the virtual, captured with a wide assortment of digital techniques and displayed in varying forms of moving image. His teaching focuses on time-based motion across video, poster, and typography. Ludwig’s work explores the relationship between technologies and culture, with an emphasis on the narrative qualities of fictional design artifacts and speculative scenarios. His recent courses have focused on going beyond images, with machine learning and augmented realities. I’ve been teaching the introduction to creative coding for our first-year BA students, who then ideally continue to develop those skills in their specialized topics. Our institute is currently developing two tracks within the BA program: Visual Communication and Digital Spaces. I’m looking forward to seeing how this new focus will develop and where our courses will intersect in the near future.


Circle 02. © 2019, Dirk Koy.


«Bildwelten». © 2016, Nils Dobberstein.
FHNW HGK IDCE Moving Posters course taught by Dirk Koy


Life Is Good For Now. © 2015, Ludwig Zeller.


Urban Build-Up. © 2019, Tim Levi Keller.
Augmented Realities course taught by Ludwig Zeller.


QS
Addressing computational technologies related to emerging time-based and speculative design practices sounds compelling! It is important to expose students to the various contexts in which computational systems play a role in design. Lastly, do you have any thoughts or comments you’d like to share about computational design practices?

TD
I look forward to the next two decades in this field and what creative surprises it’ll bring. With so many open-source frameworks, libraries, and building blocks at our disposal, it’s hard to imagine needing anything else. Nevertheless, with initiatives to make code more accessible and inclusive, and to introduce it earlier in education, we’ll soon be experiencing new visions of what computational design practices can entail.
