It’s interesting to hear the answer to the question “What does a computer sound like?” in early representations of robot voices from the silver screen. Even now, the robots of the movies don’t exist (although we wake up checking the news every day to see if the breakfast droid has been invented yet). So when the idea of the humanoid robot was taking shape in past decades, people had to work with what they had: real human voice recordings were piped through strange effects to make them sound less human, and often more metallic.

So how did early sound engineers make voices sound less human? Pitch is one factor played with throughout the history of robot voices. Using a harmonizer, originally invented for musical applications, you could get a ‘double’ voice sound, thickening the audio so it sounds overlapped and phasey. A robotic voice doesn’t need to sound just like a human’s, so a very low or high pitch is acceptable, giving us the sense of a mechanical being rather than one made of flesh and bones.

Distortion is another effect you can link to the robot. Early ideas of artificial beings would have mirrored the speakers used in consumer products like radios and televisions. And to show a sound is coming from an electronic circuit, what better way than to make it a circuit that’s malfunctioning and distorting?

Phaser effects process a sound by sending it through electronic components that slightly delay different parts of the audio signal.
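That "slightly delay different parts of the signal" idea can be sketched in software as a chain of all-pass filters whose break frequency is swept by a slow oscillator; mixing the swept copy back with the dry signal creates the moving notches we hear as phasing. This is a minimal illustrative sketch, not code from the article — the function name, parameters, and defaults are all assumptions.

```python
import numpy as np

def phaser(x, fs, stages=4, lfo_hz=0.5, f_min=300.0, f_max=3000.0, mix=0.5):
    """Toy phaser: cascaded first-order all-pass filters whose break
    frequency is swept by a low-frequency oscillator (LFO).
    All parameter choices here are illustrative assumptions."""
    n = np.arange(len(x))
    # LFO sweeps the all-pass break frequency between f_min and f_max
    fc = f_min + (f_max - f_min) * 0.5 * (1 + np.sin(2 * np.pi * lfo_hz * n / fs))
    t = np.tan(np.pi * fc / fs)
    a = (t - 1) / (t + 1)  # per-sample all-pass coefficient, |a| < 1

    wet = x.astype(float).copy()
    for _ in range(stages):
        y = np.zeros_like(wet)
        x_prev = y_prev = 0.0
        for i in range(len(wet)):
            # first-order all-pass: y[i] = a*x[i] + x[i-1] - a*y[i-1]
            y[i] = a[i] * wet[i] + x_prev - a[i] * y_prev
            x_prev, y_prev = wet[i], y[i]
        wet = y
    # blending the phase-shifted copy with the dry signal makes the notches
    return (1 - mix) * x + mix * wet
```

Run a recorded voice through this (as a float array at sample rate `fs`) and you get a mild version of that swirly, electronic quality the early engineers chased with analog circuitry.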