YamahaSynth.com Forums

This is the place to talk about all things related to Yamaha Synthesizers!
Tyron
Sherlock Holmes The Voice
Sunday, 21 August 2016
Hi BM,

I'm getting comfortable enough to start exploring and have tried a little sound design using the AWM2; however, I'm not sure I completely understand the methodology of the Montage (does anyone, apart from the designers? :) ). I am using Cubase 8.5 Pro.

So I'm attempting to build a Performance: I initialize the Montage and, in Part 1, Element 1, I select an OB wave (synth lead). It sounds nice and phat.

I then think I'll add a few more of these waveforms via the Part's other Elements and detune them a little to create a thick sound (synth programming 101).

Normally at this stage I would shape the waves with a filter (or filters) to the timbre of my liking and then add my effects; however, I can't see a filter on the main Common page. So I'm wondering whether I need to apply filters individually for each Element. I have probably missed something, like the Montage having 8 'voices' per Element (which would negate the need to add extra Elements for what I'm used to), or a main filter I have overlooked. Could you please advise how I would use a single filter to shape the output of 8 Elements simultaneously (if possible), or suggest a workaround?

Responses (4)
Bad Mister
Accepted Answer Pending Moderation
In the AWM2 sample-playback engine, each "Element" is a complete sound engine. It can be used alone or, as is often the case, combined with other Elements to build a single sound, or multiple sounds.

Each Element contains a Waveform (made up of as many as 256 samples); each Element has its own Pitch EG, a Filter with its own FEG, and an Amplifier with its own AEG. Each Element can stand alone as a complete synth sound... it includes a full complement of component blocks...
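As a rough illustration of the per-Element EG idea described above, here is a minimal linear ADSR envelope sketch in Python. This is my own toy code, not Yamaha's EG implementation; the function name, parameters, and segment shapes are all assumptions. The same kind of shape is reused per Element as Pitch EG, Filter EG (FEG), and Amplitude EG (AEG), each scaling a different destination.

```python
# Minimal linear ADSR envelope sketch (illustrative only, not Yamaha's EG).
# attack/decay/release are in seconds; sustain is a level in [0, 1].

def adsr(attack, decay, sustain, release, gate_len, n, sr=48000):
    """Return n envelope samples; the gate (note held) lasts gate_len samples."""
    out = []
    a, d, r = int(attack * sr), int(decay * sr), int(release * sr)
    for i in range(n):
        if i < a:                        # attack: ramp 0 -> 1
            out.append(i / max(a, 1))
        elif i < a + d:                  # decay: ramp 1 -> sustain
            out.append(1.0 - (1.0 - sustain) * (i - a) / max(d, 1))
        elif i < gate_len:               # sustain: hold level while gate is on
            out.append(sustain)
        else:                            # release: ramp sustain -> 0
            k = i - max(gate_len, a + d)
            out.append(max(sustain * (1.0 - k / max(r, 1)), 0.0))
    return out

# One envelope: 10 ms attack, 50 ms decay, 0.6 sustain, 100 ms release.
env = adsr(0.01, 0.05, 0.6, 0.1, gate_len=12000, n=16800)
```

An AEG would multiply this envelope into the Element's output level sample by sample; an FEG would instead sweep the filter cutoff with it.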

To fully understand the power of a single Element, recall the Performance "Ens Mix".
This is a full orchestral, stereo, single-Element sound. The Oscillator is a stereo orchestral sample, made from stereo samples of Contrabasses, Cellos, Violas, and Violins; explore the components of this single-Element sound. You don't need to build this from separate components; here it is in a single Element.

So to answer your question: you don't send multiple oscillators through a single filter... here, each Element (component) has its own fully controllable filter. In this, the sample-playback engine goes beyond its analog forerunner (where all the oscillators are forced through the single analog filter usually available). In a Montage Part you can have eight Oscillators, eight filters, and eight amplifiers... all individually addressable. This is helpful in detailing sounds that change dramatically with velocity: the Pitch-Filter-Amplifier blocks in the AWM2 engine are individually programmable as to how velocity impacts their envelopes. Each Element has its own LFO, in addition to the Part LFO.
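To make the "eight Oscillators, eight filters, eight amplifiers, all individually addressable" idea concrete, here is a toy Python sketch. It is my own illustration, not the Montage's engine; every function name and parameter here is an assumption. Each Element is its own oscillator-filter-amplifier chain, and the Part simply sums the chains.

```python
import math

SR = 48000  # sample rate, an assumed value for this sketch

def saw_osc(freq, n):
    """Naive sawtooth oscillator: n samples at freq Hz."""
    return [2.0 * ((i * freq / SR) % 1.0) - 1.0 for i in range(n)]

def one_pole_lp(samples, cutoff):
    """A simple one-pole low-pass filter; each Element owns its own."""
    a = math.exp(-2.0 * math.pi * cutoff / SR)
    y, out = 0.0, []
    for x in samples:
        y = (1.0 - a) * x + a * y
        out.append(y)
    return out

def element(freq, cutoff, level, n):
    """One Element: oscillator -> its own filter -> its own amplifier."""
    return [level * s for s in one_pole_lp(saw_osc(freq, n), cutoff)]

def part(element_settings, n):
    """A Part sums up to eight independently filtered Elements."""
    chains = [element(f, c, l, n) for (f, c, l) in element_settings[:8]]
    return [sum(column) for column in zip(*chains)]

# Two slightly detuned Elements, each with a different filter cutoff:
mix = part([(220.0, 2000.0, 0.5), (221.5, 800.0, 0.5)], 256)
```

The point of the sketch: the cutoff lives inside each `element`, not at the `part` level, which mirrors the AWM2 architecture being described.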

Now, on the synthesizer side of the Montage, the FM-X architecture flips this around a bit... There are eight Operators in the FM-X engine, each capable of outputting a frequency. An Operator can be used as an audible source when its output is in our range of hearing (a Carrier), or at an inaudible rate to modulate a Carrier (a Modulator)... You will often hear it said that Operators are like FM's oscillators, but the big difference is that an Operator is not just a tone generator: it has a built-in Amplitude Envelope Generator that shapes its output.

Instead of simply doubling and detuning, try a slightly different variation of the waveform; you are often given samples with a different phase from the original. Try a different filter setting to give the sound some character. If you make it a detuned mirror image, it might be too perfect... Programming is all about trying things and personal preference. Look for already-combined wave samples: if you're after a three-oscillator sound, see whether there already exists a sample of three oscillators as a single Waveform!

In the FM-X engine the Filter is applied more like in the analog synthesizer paradigm; that is, all the Oscillators (all 8 Operators) are combined together to go through the one Filter for the Part. It is not really a limitation, just a different method of synthesis. The more filters you have access to, the more you can detail the sound's response. In FM-X that detailing is done with Modulators and the output Index as applied to a Carrier... Varying the Modulator's output causes more or less harmonic change in the resulting sound, but in a different manner from subtractively filtering harmonics.
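The Modulator/Index idea can be sketched with a simple two-operator FM pair. Again, this is illustrative Python only, not the FM-X implementation, and the names and parameters are my assumptions. Raising the "index" (the Modulator's output level) adds sideband harmonics to the Carrier, which is the opposite approach to subtractively filtering harmonics away.

```python
import math

SR = 48000  # assumed sample rate for this sketch

def fm_pair(carrier_hz, ratio, index, n):
    """Carrier sine phase-modulated by a Modulator at carrier_hz * ratio."""
    out = []
    for i in range(n):
        t = i / SR
        # Modulator: an inaudible control signal scaled by the index.
        mod = index * math.sin(2.0 * math.pi * carrier_hz * ratio * t)
        # Carrier: the audible sine, with the Modulator added to its phase.
        out.append(math.sin(2.0 * math.pi * carrier_hz * t + mod))
    return out

dull = fm_pair(220.0, 2.0, 0.0, 1024)    # index 0: a plain sine
bright = fm_pair(220.0, 2.0, 5.0, 1024)  # higher index: richer spectrum
```

With index 0 the Modulator contributes nothing and you hear a pure sine; as the index rises, the spectrum fills in with harmonics determined by the frequency ratio.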
# 1
Thanks BM,

This type of synthesis is very much the creation of Japanese scientists in laboratory coats who are very proud of the way they have split up the traditional synthesis paradigm under the guise of increased power and, therefore, increased sound-design possibilities. Sometimes too much of a good thing is no good.

To my understanding, each Element can be viewed as a macro-synth with amp env, pitch env, etc., which on paper is a wonderful thing; however, with a touch screen and jog wheel it becomes a very unsexy way of synth programming. Each aspect of the routing requires menu diving, so when you wish to use all 8 Elements, each one will require tweaking, which translates to a lot of menu diving and jog-wheel turning.

Selecting waveforms has to be done through the touch screen, but there are no waveform graphic displays to give us some idea of how they might sound. There are thousands of waveforms from which to choose and build sounds around, so this working methodology means auditioning individual waves until you find something you like. In practice I have found this menu diving and scroll-wheel turning to be less than inspiring when it comes to being creative. It's very pedantic and not as intuitive as I'm used to.

I understand that the macro-level Element allows for the creation of realistic instrument performances, but for a modern music producer who needs easy access to usable synthesized timbres, the lure of realistic, real-life instrument creation is not the main focus.

The Montage is extremely powerful, and this power would really benefit from being manageable (when required) in a traditional sound-design way. Being able to route 8 Elements to a series of powerful filters for overall shape manipulation would make this 'synthesiser' a real beast in the hands of sound designers. I get that the genius folk at Yamaha have done something quite different with the Montage, but as my uni professor once said, it's no good being a genius if your plans cannot be communicated.

Here we have a really powerful synth that, in my humble opinion, has sacrificed an intuitive workflow to implement a new way of sound synthesis that isn't intuitive or fun. It's particularly confounding because, in 2016, when massive analog beasts from Dave Smith and Oberheim ship with VST plugins that integrate their standalone boxes within a DAW, we are menu diving with the Montage!! The Montage forces several workflow changes on the user under the premise that it's 'an advanced machine' that we don't understand yet, so ultimately the issue lies with the consumer's inadequate understanding of the hardware. Nice!

I would like this message to get to the Yamaha marketing team. If, as a prospective buyer of a Montage, your current studio setup is a laptop, a guitar (or other instrument) and a mic, then it is quite feasible for the Montage to be your main audio interface. If, however, you have several pieces of outboard gear such as hardware processors, external synths, etc., then it is not at all appropriate to use the Montage as your main audio interface. The main reason is that there are not enough physical inputs to hook up external gear. There is also no ADAT. This translates to fractious sessions whereby you need to stop what you are doing, remove one hardware connection, and add another in its place. You may then have to menu dive to configure the levels and routing again, and so it's back to negotiating the touch screen and jog wheel. This, for me, takes away from the creative process.

For many studio users of the Montage, I would suggest to Yamaha that a good workaround in a future update would be for the Montage to route 4 channels of audio via USB within a DAW setup without having to be the main audio interface (6 channels if we count the physical outs as well), but I'd be happy with the four channels of audio over USB.

16 channels sounds great on paper, but for perspective, the Roland TR-8 also offers 16 channels of audio when used as the main audio interface, and I'm sure we would all agree that the TR-8 is less than ideal as the main audio interface for a studio setup. The real genius of routing audio over USB is allowing several channels of audio without having to use the device as the main audio interface. It must be extremely difficult, as the only company to come close is Access, with the Virus and its Total Integration software.

I'm asking Yamaha to play to their strengths here: create a synth that integrates tightly within a DAW without having to be the main audio interface, and that has a killer sound-design engine (yes, I know there are two). Imagine having a hardware synth with the power to emulate 4 u-he DIVAs, or 4 instances of Omnisphere running mega patches, all in real time (I say 4 because I would request 4 audio channels over USB while not using the Montage as the main audio interface). That would be really liberating as a producer, as I would not have to worry about the strain on my host CPU (crashes). I would be free to create.

I would suggest that the Montage is currently being held back from living up to its hype by the difficulties of programming it. Three months after its release, a browse through YouTube will not present you with even a handful of user videos showing really unique sounds that implement the Super Knob routing or synthesis on the Montage. Honestly, no matter which side of the Yamaha labs you are standing on, that's really quite disheartening, as it was marketed as a synthesizer that would open up and merge the domains of FM and AWM2 using new techniques of modulation. I would really like to hear a real-world example.

I think Yamaha have to ask themselves why this has not happened. Why are people struggling to get to grips with the working methodology of the Montage? During the design and research period, when the synth prototypes were being tested by fellow professionals, was there no feedback along the lines of 'Hey, this thing is a bitch to program; you'd better ship it with a VST interface'? Maybe not so vulgar, but I find it hard to believe that those who tested the synth from a contemporary sound-design perspective gave the current incarnation a thumbs up.

If any of the above rings true, then surely it's time for a change in the next update.

Please can we have two filters at the Common level so that Element groups can be shaped with a traditional approach to sound-design synthesis? Without such a change we are effectively blending Elements via the linear volume parameter alone. We could get so much deeper with a Common-page filter. I guess it might reduce polyphony, but that would be a small price to pay for the interesting and unique sounds that could be created.

Please can we have graphics of the wave shapes?

Please can there be a pool of, say, 150 interesting waves from which we can mix and match? Yes, a choice of thousands of waves is great on paper, but in practice, with a touch screen and jog wheel, it is very unsexy.

Please could we have an update allowing 4 channels of audio/MIDI to be streamed via USB in standalone mode on the Montage, i.e. without the Montage being used as the main audio interface?

Regarding the Montage VST editor that Yamaha is working on: please could the design team go through YouTube and pick, say, 5 sound designers who are up to date? Provide them with Montages (on loan) and beta copies of the VST software you have so far. Ask them to create sounds using it, then report back to the team on usability as well as improvements that are needed. I am sure I speak for a lot of folk who would be happy to wait until the software side of the Montage is sorted before the VST is released.

The power is undoubtedly in the Montage; it's just that R&D need a little help packaging it into something consumers are familiar with and able to get the most from. I'm sure they can, though.
# 2
Not saying there is no room for interface improvement, but the Montage is not a huge departure from the Yamaha pro synth/rompler legacy. Mostly there is a 1-to-1 relationship of settings and general programming workflow going back to my MO6 or Motif ES generation, and likely much earlier. There may be some learning curve, but it is also somewhat comforting to know the investment applies to past and, I would guess, future synths.

In the production environment, a computer interface makes a lot of sense. There is a great opportunity here for Yamaha or a third party to create "middleware" that presents a more intuitive interface for programming, perhaps with the ability to group settings so you can "batch"-change multiple settings at the same time. In a tablet app, touch/drag might make changing curves/envelopes easier.

I'd love to see Yamaha embrace the open-architecture model and release an SDK to make development of both USB-connected apps and offline (user, library file) apps easier. In turn, the customer would have more choice and creative tools that match individual needs. I can see the arguments against this (quality, competition, support, effort, maintenance, etc.), but the current consumer environment shows customers really value this model; see Android and various other OSes.

As far as the waveform pictures go, I'm with you to a certain level. I see this as a mixed bag where more consistency would be better. Waveform shapes are shown for LFOs and destination curves (after selecting them). Most EQs and some effects show a waveform. But the vocoder, for example, doesn't show the waveform or document the frequency-domain parameters of the band-pass filters. I'm not at the keyboard to run through the complete list, but there are more examples.
# 3
2018 © Yamaha Corporation of America and Yamaha Corporation. All rights reserved.