Everything in the studio starts and ends with the musician, from the performance to, hopefully, collecting a check. Musicians rely on the principles of audio engineering to capture an analog performance as a digital recording.
The music starts as sound captured by a microphone or DI box, or as a VST or MIDI instrument inside a hardware unit or computer. Overall volume is measured in decibels (dB). Sound can also be described by frequency, distortion, and many other qualities, all of which are useful for describing how the human ear hears and the brain perceives it.
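As a quick illustration of the decibel scale mentioned above, here is a minimal Python sketch that converts a linear sample amplitude to dBFS (decibels relative to full scale, the reference commonly used in digital audio). The function name and default reference level are illustrative, not from the original post.

```python
import math

def amplitude_to_dbfs(amplitude: float, full_scale: float = 1.0) -> float:
    """Convert a linear sample amplitude to decibels relative to full scale.

    Assumes full_scale = 1.0, i.e. the loudest value a digital sample can hold.
    """
    if amplitude <= 0:
        return float("-inf")  # silence has no finite dB value
    return 20 * math.log10(amplitude / full_scale)

# A full-scale signal sits at 0 dBFS; halving the amplitude drops it about 6 dB.
print(round(amplitude_to_dbfs(1.0), 1))  # 0.0
print(round(amplitude_to_dbfs(0.5), 1))  # -6.0
```

Because the scale is logarithmic, each halving of amplitude costs roughly 6 dB, which is why small fader moves near the top of the meter sound much bigger than the same moves near the bottom.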
The acoustics of the room you record in and the types of gear you use shape the sound in different ways. So do speaker placement, microphone placement, the effects and plugins used during mixing, the mastering, and the resolution and file type you record and render to. Other factors beyond your control, such as the listener's playback system and how its EQ is set, affect the sound as well.
While bad sound may seem self-evident, sometimes it comes down to ignorance, or simply not knowing how the gear works.
The post Intro to Audio Engineering for Musicians and Songwriters appeared first on Bad Racket Recording: Cleveland Recording Studios.