CLICKSPACE:
VISUALIZING COMPUTER PROCESSES THROUGH TIME-LAPSE REPRESENTATION
Jonah Marrs
Abstract
In “Translations from Drawing to Building,” Robin Evans describes the “blind spot between the drawing and its object” as a forgotten site in the contemporary digital workflow.[1] Evans argues that this “gap” once offered productive friction between representation types (such as sketches and physical models) in a predigital design process. In today’s design environment, however, this site has become a kind of frictionless plane, in which representations of objects can be transformed directly into 3D artifacts by automated processes that take place “behind or below the threshold of perception.”[2]
This research attempts to recapture some of the lost friction that Evans lamented in the current digital design environment and takes an initial step toward reclaiming this productive territory for designers. To that end, this investigation explores the moment when we click “print” and automatically translate a digital representation into a physical object through a laser cutter, 3D printer, or another Fab Lab CNC technology toolchain. I call this translation moment “clickspace” to reference the click of the computer mouse that begins the process and to highlight the speed at which it is undertaken. In its exploration of clickspace, this research gives formal expression to behind-the-scenes proprietary algorithms (such as the “traveling salesman” path solvers in CAM software and the STL-creating algorithms in 3D-slicing software) that operate at crucial translation moments in modern-day digital workflows. As a technique for reintroducing the element of time into these near-instantaneous computer processes, “exploded” or “time-lapsed” images and models are used as didactic representation methods to illustrate clickspace.
1. Clickspace Part I
1.1 Background
In a traditional CAM/CNC plotter or laser-cutter toolchain, a CAM algorithm first converts the image vectors in our file into a sequence of toolpaths: a series of lines and arcs transcribed in a language called G-code.[3] The order of these toolpaths is determined by an algorithm that minimizes the total distance traveled by the tool. (This path-finding task is referred to as the “traveling salesman problem” in the science of optimization.) Once the sequence of tool operations is optimized, a piece of software sends the toolpath commands to an external CNC machine, which receives and executes them one by one, as predetermined by the machine’s firmware settings, by sending signals to the appropriate motor drivers. These moments of translation, between the 2D vector image on our screen and the machined object, are the topic of the first set of experiments described in this paper.
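To make this sequencing step concrete, the following Python sketch orders a set of line segments with the greedy nearest-neighbor heuristic commonly used to approximate the traveling salesman problem, then emits elementary pen-plotter G-code. It is a minimal stand-in for proprietary CAM code, not a reconstruction of it; the function names and the two-command G-code dialect (G0 for rapid moves, G1 for drawing moves) are illustrative assumptions.

import math

def order_paths(segments, start=(0.0, 0.0)):
    """Greedy nearest-neighbor sequencing of line segments.

    A minimal stand-in for the 'traveling salesman' ordering a CAM
    package performs: at each step, pick the remaining segment whose
    nearest endpoint minimizes pen-up travel from the current position.
    Each segment is a pair of points: ((x0, y0), (x1, y1)).
    """
    remaining = list(segments)
    position, ordered = start, []
    while remaining:
        # Find the segment endpoint closest to the current tool position.
        seg, end = min(
            ((s, e) for s in remaining for e in (0, 1)),
            key=lambda se: math.dist(position, se[0][se[1]]),
        )
        remaining.remove(seg)
        if end == 1:
            seg = (seg[1], seg[0])  # enter from the far end: flip it
        ordered.append(seg)
        position = seg[1]  # the tool now rests at the segment's end
    return ordered

def to_gcode(ordered):
    """Emit elementary pen-plotter G-code for an ordered segment list."""
    lines = []
    for (x0, y0), (x1, y1) in ordered:
        lines.append(f"G0 X{x0:.3f} Y{y0:.3f}")  # rapid move, pen up
        lines.append(f"G1 X{x1:.3f} Y{y1:.3f}")  # drawing move, pen down
    return "\n".join(lines)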
1.2 Clickspace 2D Experiments
To investigate the clickspace of the plotter/laser-cutter toolchain, a custom CAM/CNC plotter toolchain was developed. First, a sequencing algorithm was devised in Grasshopper to allow novel toolpath sequences to be created from an input vector drawing. Instead of optimizing for the shortest run time, this algorithm sequences paths based on user-defined priorities, such as shortest to longest line or leftmost to rightmost line. Next, a custom CNC drawing machine was devised to record the sequence of CNC machine movements as choreographed by the CAM software. With this device, the CAM drawing sequences could be “stretched” to varying degrees in time to reveal the sequence of operations undertaken by the CNC machine.
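The Grasshopper definition itself is not reproduced here; the short Python sketch below shows the equivalent sorting logic. The two priority names are taken from the examples above, and any other single-key rule could be substituted.

import math

# Designer-chosen sequencing priorities, echoing the examples above;
# each maps a segment ((x0, y0), (x1, y1)) to a sortable key.
PRIORITIES = {
    "shortest_to_longest": lambda seg: math.dist(seg[0], seg[1]),
    "leftmost_to_rightmost": lambda seg: min(seg[0][0], seg[1][0]),
}

def sequence_paths(segments, priority="shortest_to_longest"):
    """Reorder toolpaths by a user-defined rule rather than run time."""
    return sorted(segments, key=PRIORITIES[priority])

Unlike the nearest-neighbor ordering sketched earlier, these rules deliberately ignore travel efficiency; that inefficiency is what makes the machine’s drawing sequence legible to the designer.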
This custom CAM/CNC plotter toolchain allowed for experimentation with three main parameters: the input CAD drawing, the CAM vector sequence, and the degree of CNC actuation recording in time. Adjusting the parameters and comparing the resultant drawing to the input image revealed the range of transformational potential in clickspace that can be prompted by the designer.
2. Clickspace 3D Experiments
2.1 Background
This second foray into clickspace attempts to visualize the processes undertaken by STL-creation algorithms and to describe the formal implications of their translation process on input files. In a typical 3D-prototyping workflow, the user loads a 3D model into a slicing program, which generates commands for a rapid prototyping machine to make a 3D print. To achieve this, slicing programs first translate our 3D model into a Stereolithography (STL) or comparable file format. These file formats were devised to describe objects in triangles or quadrilaterals, which are far easier to divide into layers than curved surfaces.[4] A second computer process then slices the translated model built of triangles to produce a stack of polygons, which represent the outline of the object being prototyped at varying Z-heights. These outlines are subsequently filled (if a solid object is desired), and finally a series of G-code instructions is produced for the 3D printer.
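A minimal sketch of this slicing step, assuming the translated model arrives as a “triangle soup” of (x, y, z) vertex triples: every triangle that straddles a given Z plane contributes one segment to that layer’s outline. Degenerate cases, such as a vertex lying exactly on the slicing plane, are ignored for brevity.

def slice_triangle(tri, z):
    """Intersect one triangle with the horizontal plane Z = z.

    Returns the intersection as a pair of (x, y) points, or None if
    the plane misses the triangle. Each vertex of tri is (x, y, z).
    """
    points = []
    for i in range(3):
        (x0, y0, z0), (x1, y1, z1) = tri[i], tri[(i + 1) % 3]
        if (z0 - z) * (z1 - z) < 0:  # this edge crosses the plane
            t = (z - z0) / (z1 - z0)  # interpolation factor along the edge
            points.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return tuple(points) if len(points) == 2 else None

def slice_mesh(triangles, layer_height):
    """Slice a triangle mesh into per-layer outline segments."""
    z_max = max(z for tri in triangles for (_, _, z) in tri)
    layers, z = [], layer_height / 2  # sample at mid-layer heights
    while z < z_max:
        segments = [s for s in (slice_triangle(t, z) for t in triangles) if s]
        layers.append((z, segments))
        z += layer_height
    return layers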
2.2 Clickspace 3D
Algorithms that generate STL files and convert curves into polygonal meshes follow these steps (a code sketch follows the list):
- Place points on all of the shared edges of all of the surfaces of the input model (the number of triangles often depends on user-defined settings).
- Create triangles on each surface while ensuring that each corner is coincident with at least one other corner.[5]
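As an illustration of these two steps, the Python sketch below triangulates a cylinder wall. Sample points are placed on a shared parameter grid, and triangles are built so that every corner lands exactly on a grid point reused by its neighbors, satisfying the vertex-to-vertex rule; the cylinder geometry and the two resolution parameters are assumptions made for the example, not part of the STL specification itself.

import math

def tessellate_cylinder(radius, height, n_around, n_up):
    """Triangulate a cylinder wall into an STL-style list of triangles.

    Sample points sit on a shared (i, j) parameter grid, so adjacent
    triangles reuse identical vertices: the vertex-to-vertex rule
    described above. n_around and n_up play the role of the
    user-defined resolution settings.
    """
    def vertex(i, j):
        angle = 2 * math.pi * (i % n_around) / n_around
        return (radius * math.cos(angle),
                radius * math.sin(angle),
                height * j / n_up)

    triangles = []
    for i in range(n_around):
        for j in range(n_up):
            v00, v10 = vertex(i, j), vertex(i + 1, j)
            v01, v11 = vertex(i, j + 1), vertex(i + 1, j + 1)
            triangles.append((v00, v10, v11))  # lower half of grid quad
            triangles.append((v00, v11, v01))  # upper half of grid quad
    return triangles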
To visualize the sequence in which the STL algorithm has translated a 3D object into a series of triangles, a simple Processing script was devised. The script adds a successively larger Z-height to triangle vertices in the order in which they were written to an STL file by the STL-creating algorithm. The result is a new STL file that has been “stretched” upwards, revealing the order in which the triangles were created by the algorithm while maintaining their X and Y positions.
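The script itself is not reproduced in the paper; the Python sketch below performs the equivalent transformation on an ASCII STL file. The step size is an assumed parameter, and the facet normals are left unrecomputed, since the sheared output is meant only for viewing.

def stretch_stl(src, dst, step=0.5):
    """'Time-lapse' an ASCII STL: lift each facet by its write order.

    Every facet keeps its X and Y coordinates, but its three vertices
    gain a Z offset proportional to the facet's position in the file,
    revealing the sequence in which the triangles were written. The
    facet normals are not recomputed, as the output is for viewing only.
    """
    facet_index = 0
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            parts = line.split()
            if parts[:1] == ["vertex"]:
                x, y, z = map(float, parts[1:4])
                line = f"      vertex {x} {y} {z + facet_index * step}\n"
            elif parts[:1] == ["endfacet"]:
                facet_index += 1  # the next facet sits one step higher
            fout.write(line)

Opening the resulting file in any mesh viewer lets the algorithm’s writing order be read off the model from bottom to top.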
3. Clickspace Experiments in 2D and 3D: Conclusions
Algorithms, like the “traveling salesman” path solvers in CAM software or the STL-creating algorithms in 3D modeling and slicing software, take over complex translation moments in our design process. This research works under the conviction that these algorithms operate in a sphere traditionally considered to be within the realm of architecture (namely, the translation of 2D and 3D forms across different media), and that their formal implications should therefore be revealed and made accessible.
This research presents a method of mapping moments of translation at machine-machine interfaces with “exploded” or “time-lapsed” drawings and models. This technique of stretching along an axis reintroduces time into a nearly instantaneous computer process to reveal what is often invisible and to provide an instructive record of a transformation sequence. In contrast with traditional exploded building representations, which reveal construction components as we understand them, these exploded images and models reveal digital building components, such as lines and triangles, as the computer understands them.
Jonah Marrs is a Master of Science in Architecture Studies candidate with the Computation Group at MIT. He has a background in history, architecture, and electronics design. He is interested in the intersection of media archaeology and 20th-century experimental art. Jonah has worked as an architect in Berlin, an electromechanical prototyper at the Brooklyn Navy Yard, a guest digital archival researcher at Montreal’s Canadian Centre for Architecture, and an artist in residence at Autodesk’s Pier 9 in San Francisco.
NOTES
[1] Robin Evans, “Translations from Drawing to Building,” in Translations from Drawing to Building and Other Essays (Cambridge, MA: The MIT Press, 1997), 182.
[2] John May, “Field Notes from the Instruments Project,” Journal of Architectural Education 69, no. 1 (Spring 2015): 59.
[3] Research on Clickspace Part I was conducted as part of the author’s M.Arch thesis at the University of Toronto during the spring 2015 term. Professors Laura Miller and John May served as thesis advisors.
[4] 3D Systems Inc., Stereolithography Interface Specification, October 1989.
[5] 3D Systems Inc., Stereolithography Interface Specification, October 1989.