It’s All in the Mix . . . and in the Master

That vintage vinyl record you place on your turntable is the result of creative and technical choices made along the way from song to disc. By the late 1960s, record making had become a lengthy process involving multi-track recording, mixing, and mastering. In the following selection from Chasing Sound: Technology, Culture, and the Art of Studio Recording from Edison to the LP (Johns Hopkins, 2013), Susan Schmidt Horning reveals how even at the mastering stage, recording engineers had the power to shape the sound of the final record.

Once the final mix was complete, the third stage of making a record involved cutting a master disc from the mixed tape. Before magnetic tape became the standard recording medium, this master disc was made at the time of recording, just as mixing was done during the recording. When tape became the primary medium, the final mixed tape became the master, and the lacquer disc cut from it became the master sent for processing into vinyl LPs or singles. To reduce noise inherent in the process, the mastering engineer boosted high frequencies and might also apply some equalization to bring up the sound of certain instruments, but this was the extent of what the mastering engineer was expected to do. Once multi-tracking introduced the post-mixing stage—that is, mixing after rather than during the recording—some of the postponing of decisions that took place in recording spilled over to the mastering stage.

Until 1954, when the Recording Industry Association of America (RIAA) established a standard recording curve, record companies had used a variety of different recording, or equalization, curves in cutting a master disc. To avoid overcutting and reduce surface noise inherent in disc recording and playback, the RIAA curve reduced the bass and boosted high frequencies at the time of recording (the recording curve), and modern phonograph preamps included circuitry to reverse those changes at the time of playback (the playback curve). Even after the RIAA standard, mastering engineer Bill Stoddard said, “in the mastering room we did what sounded best!” In the late 1950s, the National Association of Broadcasters (NAB) specified disc-recording reference levels (7 cm/s at 1 kHz for mono, 5 cm/s for stereo), but most mastering engineers used those as minimums, preferring to cut with as much level as they could get away with. As Columbia engineer Doug Pomeroy recalled, “It was never hard to find discs which were cut MUCH hotter.” Stoddard recalled that in the early 1960s there was one popular record that was “the loudest and hottest record ever. We had a copy of that record [The Isley Brothers’ “Twist and Shout,” Wand, 1962, mastered at Bell Sound], and would compare everything we cut to that record for level and apparent loudness.” Jack Wiener, mastering engineer at Universal Recording in Chicago, agreed that the level that the RIAA curve designated was considered by the major record labels to be a “hard and fast rule that thou shalt never break,” but which he consistently ignored, particularly because of the influence of record producer Randy Wood. Wood, who ran Dot Records, had a jukebox delivered to the mastering room at Universal Recording and instructed the mastering engineers to increase the level until they reached a point where the record would not track in the jukebox; then they would reduce the level just enough so that it would play.
According to Stoddard, “Wood said he didn’t care if he had to eat some records . . . he just wanted [it] to be the loudest.”
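The shape of the standardized curve described above can be sketched from the RIAA's published time constants (3180 µs, 318 µs, and 75 µs). The following illustrative Python snippet—an approximation for readers curious about the numbers, not part of Horning's text—computes the playback-curve gain relative to 1 kHz, showing the roughly 19 dB bass boost and treble cut that a phono preamp applies to undo what the cutting lathe did:

```python
import math

# Standard RIAA time constants, in seconds: two poles (T1, T3) and
# one zero (T2) of the playback (de-emphasis) transfer function.
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_playback_gain_db(f_hz, ref_hz=1000.0):
    """Playback-curve gain in dB at f_hz, normalized to 0 dB at ref_hz."""
    def mag(f):
        w = 2 * math.pi * f
        return abs(complex(1, w * T2)) / (
            abs(complex(1, w * T1)) * abs(complex(1, w * T3)))
    return 20 * math.log10(mag(f_hz) / mag(ref_hz))

# On playback, bass is boosted and treble cut, mirroring the opposite
# adjustments made when the master was cut.
print(round(riaa_playback_gain_db(20), 1))     # prints 19.3
print(round(riaa_playback_gain_db(20000), 1))  # prints -19.6
```

The recording curve is simply the inverse of this function, which is why cutting "hot" against it risked overcutting: extra level on top of the already-boosted highs pushed the cutter head beyond what the groove, and many styli, could track.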

Mastering “hot,” as it was called, seemed to be the goal of every record company in popular recording to make its records stand out when played on jukeboxes. As soon as Mitch Miller moved to Columbia Records from Mercury, Bill Savory recalled that Miller would come down to the engineering department and say to the engineers, “‘Hey fellas, we’ll go to lunch today, this little deli has a jukebox, and I’ll bet you anything, you play any record on there and it’ll be louder than a Columbia record.’” Miller’s purpose was to get the engineering department to come up with a better method of mastering to make Columbia’s popular records louder. It could be done, but not by adhering to the RIAA curve, and usually at the cost of record length, because louder passages required wider grooves, thus reducing the number of grooves a given disc could accommodate. Making hot masters that could still track on most phonographs (play without skipping) was the goal of every mastering engineer, and rarely would the artists or producer be involved at this stage. In 1964 mastering engineer Clair Krepps received a tape from his client United Artists, with the request that he cut a master. As he listened to the tape, Krepps recognized it as a song he had just mastered the previous year, “Do Wah Diddy Diddy,” a song by Jeff Barry and Ellie Greenwich that The Exciters recorded in 1963. That record had not done all that well, and since the tape of the new version by Manfred Mann did not sound any better to Krepps, he called the company and asked why they wanted to put it out. Claiming contractual commitments to the band, United Artists told Krepps, “‘Look, do any damn thing you want with that record, we don’t care!’” Krepps recalled. “So I started playing around with it. And I got the idea to shock the industry, so I used some equipment that my brother and I designed, and undoubtedly I made the loudest record ever made—45 it was.
And pretty soon, other engineers, once it became a hit, they called me and complained, ‘What the hell are you doing Clair, you know better than this!’” Because no producer or artist was looking over his shoulder, giving direction on how the record should be mastered, and the record company did not care, Krepps was able to experiment. When the record hit the top of the charts in England, Canada, and the United States, of course, no one complained, and it seemed to reinforce the idea that hot masters made hit records.

Few mastering engineers, however, could enjoy that kind of freedom, as artists and producers increasingly stipulated that their masters be cut a certain way. Grundy recalled that some producers would send their tapes to be mastered with explicit instructions like “The first four bars on the left track of song two are okay, but the next four bars require a boost in the high end.” The engineer was then expected to make changes during the actual mastering process, sometimes adding reverb or changing equalization. Grundy said that these instructions “began to grow to absurdity,” to the point where it ultimately became necessary for studios to duplicate all of the signal-processing devices—the equalizers, limiters, compressors, phasers, all of the things that might be used to affect the sound of the music in the mastering process—with each available on two different systems, A and B. As Grundy explained, “The left and right channel of the first song on the record would have certain requirements for EQ and whatever. Well those are set up on the A channel, the left and right A; and then the requirements for the second song are on the B, [to avoid] switching instantly during the spiral from [band one to band two]. Then while B is being cut, like a three minute song, you’ve got to set up A again for the third band, and then switch back to A during the spiral between two and three.” So where the mastering engineer once could set his equalization curve, making slight adjustments from one song to the next while inspecting the grooves, he now had to do double duty, setting up one song and then switching to the second channel quickly while the cutter spiraled to the next song. Here, too, the job of the mastering engineer grew more complex, even more creative, as a result of multi-tracking and post-mixing, and this led to the rise of independent mastering engineers and studios that did nothing but mastering.
And certain mastering engineers became highly valued for their ability to do this effectively. Bob Ludwig and Bernie Grundman in New York made their reputations by being able “to take poorly prepared master tapes and produce a master disc that was amazing in the corrections that they were able to implement,” Grundy recalled. “And, of course, everyone wanted Bob to do their record because he could make a silk purse out of a sow’s ear.”

Some mastering engineers refused to respond to these requests. When Bill Stoddard was a mastering engineer at Fine Recording in the late 1950s, Kapp Records was one of the studio’s biggest accounts. Kapp made so many requests in mastering that Bob Fine eventually set a limit on the number they could make. But Stoddard recalled getting instructions like “two more notches of bass on band three,” which just made no sense as far as he was concerned, so he would just cut it the way he wanted to. He figured, “The hell with it. Nobody ever said anything. This was a foregone conclusion—with Bob’s blessing too—it was just a way to get the thing out. The more sides you cut the more money you make.” Stoddard’s decision to ignore the client’s request was not driven by economic motives so much as it was his conviction that it was the mastering engineer’s job to use his own good judgment to make a record as close to the tape as possible, and it was the client’s job to supply an edited tape ready for mastering. But this trend toward expecting more out of the mastering process only escalated during the 1960s. Eventually, the decline in the number of ready masters and the increase in the number passed on to the mastering stage for further adjustment led to the rise of mastering as a specialized field. Since engineers who began after the advent of tape were not required to learn disc cutting in order to record, the disc-cutting skills required for mastering became the one area of recording that retained “the most trade secrets,” and as those trained during the era of disc recording began to retire, new engineers could only learn from the technicians whose tacit knowledge and skill had been gleaned over decades. By the 1970s, mastering was no longer simply the transfer of a recording from tape to master disc; it was considered the last creative step and the first manufacturing step in the record-making process.