Phase Vocoder
A phase vocoder is a vocoder-type algorithm that manipulates information in the frequency and time domains of audio signals by using phase information extracted from a frequency transform. The algorithm allows frequency-domain modifications of a digital sound file, typically time expansion/compression and pitch shifting. At the heart of the phase vocoder is the short-time Fourier transform (STFT), usually implemented with fast Fourier transforms. The STFT converts a time-domain representation of a sound into a time-frequency representation (the "analysis" stage), in which the amplitudes and phases of specific frequency components can be modified before the inverse STFT resynthesizes the time-frequency representation back into the time domain. The time evolution of the resynthesized sound can be changed by shifting the time positions of the STFT frames prior to resynthesis, allowing time-scale modification ...
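The analysis, modification and resynthesis steps described above can be sketched in a few lines of Python. The function name, the Hann window, and the frame and hop sizes below are illustrative choices rather than a reference implementation: magnitudes are interpolated between analysis frames, and phases are accumulated so that sinusoids stay continuous across the overlap-added output frames.

import numpy as np

def phase_vocoder_stretch(x, rate, n_fft=2048, hop=512):
    # Analysis: windowed STFT frames taken every `hop` samples.
    window = np.hanning(n_fft)
    frames = np.array([np.fft.rfft(window * x[i:i + n_fft])
                       for i in range(0, len(x) - n_fft, hop)])

    # Expected phase advance per hop for each rfft bin.
    bins = np.arange(n_fft // 2 + 1)
    expected = 2 * np.pi * hop * bins / n_fft

    # Read the analysis frames at a scaled time step; rate > 1 stretches.
    steps = np.arange(0, len(frames) - 1, 1.0 / rate)
    phase = np.angle(frames[0])
    out = np.zeros(len(steps) * hop + n_fft)

    for k, t in enumerate(steps):
        i = int(t)
        frac = t - i
        # Interpolate magnitudes between neighbouring analysis frames.
        mag = (1 - frac) * np.abs(frames[i]) + frac * np.abs(frames[i + 1])
        # Resynthesize with the accumulated phase and overlap-add into the output.
        out[k * hop:k * hop + n_fft] += window * np.fft.irfft(mag * np.exp(1j * phase), n_fft)
        # Advance the phase by the measured (unwrapped) frame-to-frame increment.
        dphi = np.angle(frames[i + 1]) - np.angle(frames[i]) - expected
        dphi -= 2 * np.pi * np.round(dphi / (2 * np.pi))
        phase += expected + dphi
    return out

Called as phase_vocoder_stretch(x, 2.0) on a mono NumPy signal x longer than one frame, this returns a signal roughly twice as long with the pitch unchanged.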


Vocoder
A vocoder (a portmanteau of ''voice'' and ''encoder'') is a category of speech coding that analyzes and synthesizes the human voice signal for audio data compression, multiplexing, voice encryption or voice transformation. The vocoder was invented in 1938 by Homer Dudley at Bell Labs as a means of synthesizing human speech. This work was developed into the channel vocoder, which was used as a telecommunications voice codec to conserve bandwidth in transmission. By encrypting the control signals, voice transmission can be secured against interception; its primary use in this fashion is for secure radio communication. The advantage of this method of encryption is that none of the original signal is sent, only the envelopes of the bandpass filters. The receiving unit needs to be set up with the same filter configuration to re-synthesize a version of the original signal spectrum. The vocoder has also been used extensively as an electronic musical instrument. ...
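As a rough illustration of the analysis/synthesis idea in the excerpt above, the sketch below splits a speech signal into bandpass channels, keeps only the channel envelopes, and reimposes them on the same bands of a carrier signal. The band count, band edges and envelope smoothing cutoff are assumed values, not the parameters of any historical channel vocoder, and the sample rate is assumed to be well above 12 kHz.

import numpy as np
from scipy.signal import butter, sosfilt

def channel_vocoder(modulator, carrier, sr, n_bands=16, fmin=80.0, fmax=6000.0):
    edges = np.geomspace(fmin, fmax, n_bands + 1)               # log-spaced band edges
    env_lp = butter(2, 50.0, btype="low", fs=sr, output="sos")  # envelope smoother
    n = min(len(modulator), len(carrier))
    out = np.zeros(n)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = butter(4, [lo, hi], btype="band", fs=sr, output="sos")
        # Analysis side: keep only the envelope of the speech in this band.
        env = sosfilt(env_lp, np.abs(sosfilt(band, modulator[:n])))
        # Synthesis side: impose that envelope on the same band of the carrier.
        out += env * sosfilt(band, carrier[:n])
    return out / (np.max(np.abs(out)) + 1e-12)                  # simple normalisation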


IEEE Transactions On Speech And Audio Processing
The Institute of Electrical and Electronics Engineers (IEEE) is a 501(c)(3) professional association for electronic engineering and electrical engineering (and associated disciplines) with its corporate office in New York City and its operations center in Piscataway, New Jersey. The mission of the IEEE is ''advancing technology for the benefit of humanity''. The IEEE was formed in 1963 from the amalgamation of the American Institute of Electrical Engineers and the Institute of Radio Engineers. Because its scope has expanded into so many related fields, it is simply referred to by the letters I-E-E-E (pronounced I-triple-E), except on legal business documents. It is the world's largest association of technical professionals, with more than 423,000 members in over 160 countries. Its objectives are the educational and technical advancement of electrical and electronic engineering, telecommunications, computer engineering and similar disciplines. ...



Audio Time Stretching And Pitch Scaling
Time stretching is the process of changing the speed or duration of an audio signal without affecting its pitch. Pitch scaling is the opposite: the process of changing the pitch without affecting the speed. Pitch shift is pitch scaling implemented in an effects unit and intended for live performance. Pitch control is a simpler process which affects pitch and speed simultaneously by slowing down or speeding up a recording. These processes are often used to match the pitches and tempos of two pre-recorded clips for mixing when the clips cannot be reperformed or resampled. Time stretching is often used to adjust radio commercials and the audio of television advertisements to fit exactly into the 30 or 60 seconds available. It can be used to conform longer material to a designated time slot, such as a 1-hour broadcast. Resampling: The simplest way to change the duration or pitch of an audio recording is to change the playback speed. For a digital audio recording, this can be accomp ...
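A minimal sketch of the resampling approach mentioned at the end of the excerpt: reading a digital recording at a different step changes pitch and duration together. The linear interpolation and the speed parameter below are illustrative simplifications.

import numpy as np

def resample_playback(x, speed):
    # Read positions in the original signal, spaced `speed` samples apart;
    # values in between are linearly interpolated.
    positions = np.arange(0, len(x) - 1, speed)
    return np.interp(positions, np.arange(len(x)), x)

For instance, resample_playback(x, 2.0) returns a signal about half as long that, played back at the original sample rate, sounds twice as fast and one octave higher.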


Curtis Roads
Curtis Roads (born May 9, 1951) is an American composer, author and computer programmer. He composes electronic and electroacoustic music, specializing in granular and pulsar synthesis. Career and music: Born in Cleveland, Ohio, Roads studied composition at the California Institute of the Arts and the University of California San Diego. He is former chair and current vice chair of the Media Arts and Technology Program at the University of California, Santa Barbara ("MAT: Faculty and Researchers", ''Mat.UCSB.edu''). He has previously taught at the "Federico II" ...

JoAnn Kuchera-Morin
JoAnn Kuchera-Morin (born 1951) is a professor of Media Arts & Technology and of Music. A composer and researcher specializing in multimodal interaction, she is the Creator and Director of the AlloSphere at the California NanoSystems Institute and the Creator and Director of the Center for Research in Electronic Art Technology (CREATE) at the University of California, Santa Barbara. Kuchera-Morin initiated and was Chief Scientist of the University of California Digital Media Innovation Program (DiMI) from 1998 to 2003. The culmination of Kuchera-Morin’s creativity and research is the AlloSphere instrument, a 30-foot diameter, 3-story high metal sphere inside an echo-free cube, designed for immersive, interactive scientific and artistic investigation of multi-dimensional data sets. Scientifically, the AlloSphere is an instrument for gaining insight and developing bodily intuition about environments into which the body cannot venture: abstract higher-dimensional information spa ...




Roger Reynolds
Roger Lee Reynolds (born July 18, 1934) is a Pulitzer Prize-winning American composer. He is known for his capacity to integrate diverse ideas and resources, and for the seamless blending of traditional musical sounds with those newly enabled by technology. Beyond composition, his contributions to musical life include mentorship, algorithmic design, engagement with psychoacoustics, writing books and articles, and festival organization. During his early career, Reynolds worked in Europe and Asia, returning to the US in 1969 to accept an appointment in the music department at the University of California, San Diego. His leadership there established it as a state-of-the-art facility – in parallel with Stanford, IRCAM, and MIT – and a center for composition and computer music exploration. Reynolds won early recognition with Fulbright, Guggenheim, National Endowment for the Arts, and National Institute of Arts and Letters awards. In 1989, he was awarded the Pulitzer Prize for a string ...


Transfigured Wind
Transfiguration(s) or The Transfiguration may refer to:
Religion
* Transfiguration of Jesus, an event in the Bible
* Feast of the Transfiguration, a Christian holiday celebrating the Transfiguration of Jesus
* Transfiguration (religion), a momentary transformation of a person into some aspect of the divine
Paintings
* ''Transfiguration'' (Bellini, Venice), c. 1454–1460
* ''Transfiguration of Christ'' (Bellini), c. 1480
* ''Transfiguration'' (Lotto), c. 1510–1512
* ''Transfiguration Altarpiece'' (Perugino), 1517
* ''Transfiguration'' (Pordenone), c. 1515–1516
* ''Transfiguration'' (Raphael), c. 1516–1520
* ''Transfiguration'' (Rubens), 1604–1605
* ''Transfiguration'' (Savoldo), c. 1530
Film and television
* ''The Transfiguration'' (film), a 2016 American film
* Transfiguration (Harry Potter), a subject taught at Hogwarts in ''Harry Potter'' media
* "Transfigurations", a 1990 episode of ''Star Trek: The Next Generation''
Literature
* ''Transfigurations'' (n ...


Vox Cycle
''Vox Cycle'' is a cycle of six compositions (independent movements) for four amplified voices and electroacoustic sounds by Trevor Wishart, composed between 1980 and 1988 and associated with extended vocal techniques and contemporary vocal composition. The cycle focuses on the relationship and interpolation between natural sounds and the human voice, a long-standing musical interest of the composer reaching back to the composition ''Red Bird'', realized by analog means. The poetics underlying the work have linguistic and philosophical dimensions, concerning the relationship between the creation and disintegration of man, and between natural development and the failure of Western culture and society. Claude Lévi-Strauss's ''The Raw and the Cooked'' suggested this central idea to the composer. All the movements included on the original record are performed by the Electric Phoenix ensemble, using the extended vocal techniques foll ...


Trevor Wishart
Trevor Wishart (born 11 October 1946) is an English composer, based in York. Wishart has contributed to composing with digital audio media, both fixed and interactive. He has also written extensively on the topic of what he terms "sonic art", and has contributed to the design and implementation of software tools used in the creation of digital music, notably the Composers Desktop Project. Wishart was born in Leeds, West Riding of Yorkshire. He was educated at the University of Oxford (BA 1968), the University of Nottingham (MA 1969), and the University of York (PhD 1973). Although mainly a freelance composer, he holds an honorary position at the University of York. He was appointed composer-in-residence at the University of Durham in 2006, and then at the University of Oxford Faculty of Music in 2010–11, supported by the Leverhulme Trust. Music: Wishart's compositional interests deal mainly with the human voice, in particular its transformation and the interpol ...


IRCAM
IRCAM (French: ''Institut de recherche et coordination acoustique/musique''; English: Institute for Research and Coordination in Acoustics/Music) is a French institute dedicated to the research of music and sound, especially in the fields of avant-garde and electro-acoustical art music. It is situated next to, and is organisationally linked with, the Centre Pompidou in Paris. The extension of the building was designed by Renzo Piano and Richard Rogers. Much of the institute is located underground, beneath the fountain to the east of the buildings. A centre for musical research: Several concepts for electronic music and audio processing have emerged at IRCAM. John Chowning pioneered work on FM synthesis at IRCAM, and Miller Puckette originally wrote Max at IRCAM in the mid-1980s; it would become the real-time audio processing graphical programming environment Max/MSP, which has subsequently become a widely used tool in electroacoustic music. Many of the techniques associated with spectralism, such as analyses based on ...


Spectral Leakage
The Fourier transform of a function of time, s(t), is a complex-valued function of frequency, S(f), often referred to as a frequency spectrum. Any linear time-invariant operation on s(t) produces a new spectrum of the form H(f)•S(f), which changes the relative magnitudes and/or angles (phase) of the non-zero values of S(f). Any other type of operation creates new frequency components that may be referred to as spectral leakage in the broadest sense. Sampling, for instance, produces leakage, which we call ''aliases'' of the original spectral component. For Fourier transform purposes, sampling is modeled as a product between s(t) and a Dirac comb function. The spectrum of a product is the convolution between S(f) and another function, which inevitably creates the new frequency components. But the term 'leakage' usually refers to the effect of ''windowing'', which is the product of s(t) with a different kind of function, the window function. Window functions happen to have fi ...
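A small numerical illustration of the windowing effect described above, with an arbitrary tone frequency and transform length: a sinusoid that does not land exactly on an FFT bin spreads energy into distant bins, and a tapered window such as the Hann window lowers those far-off side lobes at the cost of a wider main lobe.

import numpy as np

n = 1024
t = np.arange(n)
tone = np.sin(2 * np.pi * 10.5 * t / n)       # 10.5 cycles: falls between bins 10 and 11

rect = np.abs(np.fft.rfft(tone))              # rectangular window (no taper)
hann = np.abs(np.fft.rfft(np.hanning(n) * tone))

# Far from the tone, the rectangular window leaks much more energy than Hann.
for name, spec in (("rectangular", rect), ("hann", hann)):
    print(name, "level at bin 100 (dB rel. peak):",
          round(20 * np.log10(spec[100] / spec.max()), 1))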