
Monday, March 30, 2020

Q. Is it best to synchronise all my digital gear using a word clock generator?

By Hugh Robjohns
Please could Hugh Robjohns write a comprehensive article explaining the operational advantages and disadvantages of using a word-clock signal to synchronise studio equipment as compared to alternative methods? Further, if the audio has been reference-clocked as it was recorded, does the replay chain (perhaps including multiple downstream signal processors) still require a synchronisation reference or is the clocking information embedded in the recorded data sufficient to hold all the downstream equipment in the correct relationship? Finally, with respect to jitter, will using an external master clock to synchronise the equipment chain prevent it?
SOS Forum Post

Technical Editor Hugh Robjohns replies: A new series of articles concerning various practical aspects of working with digital audio is planned for the near future, but in the meantime I'll have a bash at tackling your list of clocking questions.

A lot of equipment accepts only a simple word clock reference input, rather than AES or composite video references, purely because it is far easier and cheaper to implement. However, there is no significant technical advantage in using only the word clock format. Some might argue that the ability to daisy-chain a word clock signal around a number of devices using BNC T-pieces makes word clock superior since it provides a cheap and convenient way of distributing a reference clock. The problem is that while this approach can work in controlled situations, there are inherent dangers involved if the equipment isn't (or can't be) configured correctly, or the setup is changed without correctly re-engineering the chain. A proper star-shaped distribution of clock signals from a dedicated hub or master generator, using word clock, AES or a combination of both, is a far better and more reliable approach.
The Drawmer D-Clock provides a total of 20 word clock outputs, but is it the answer?

As to your second question, a digital recording has, by definition, to be clocked from a reference at the source. That reference is most often the internal crystal clock of an A-D converter. At each subsequent transfer of the digital audio from one machine to the next (assuming the use of AES, S/PDIF, SDIF3, MADI or ADAT interfaces) the clocking information is fully embedded and passed along with the audio. Assuming the equipment is configured to extract the embedded reference clock from its input signal, then it is not strictly necessary to provide separate reference clock signals from a master generator system. However, in larger setups there are significant practical and technical advantages in using a central master clock to provide stable references to the entire system.
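
To see how that clock embedding works, it helps to know that AES3 and S/PDIF transmit their data in biphase-mark code, which guarantees a transition at every bit-cell boundary; the receiver regenerates the clock from those transitions. Here is a minimal Python sketch of the line coding alone (a simplification: real AES3 frames also carry preambles, channel status and parity, all omitted here):

# Biphase-mark coding, as used by AES3/S/PDIF to embed the clock in the data:
# every bit cell starts with a transition, and a '1' adds a second transition
# mid-cell, so the receiver always has enough transitions to recover a clock.
def biphase_mark_encode(bits, level=0):
    """Return two half-cell line levels per input bit."""
    out = []
    for bit in bits:
        level ^= 1                  # transition at every cell boundary
        out.append(level)
        if bit:
            level ^= 1              # extra mid-cell transition encodes a '1'
        out.append(level)
    return out

print(biphase_mark_encode([1, 0, 1, 1, 0]))  # -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]

Because the transition density never drops below one per cell, a receiver set to clock from its input can stay locked to the source regardless of what audio data is being carried.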

Jitter is the enemy of all clocking systems because it introduces a degree of uncertainty in the timing of samples, which translates as a rise in the noise floor with various noise modulation effects, and often causes a blurring or instability in the stereo image at the A-D and D-A conversion stages. However, it should be noted that these jitter effects only become an issue at the points where audio is converted between the analogue and digital worlds — digital transfers between equipment are completely unaffected by even quite severe levels of jitter.
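
To put a rough figure on that noise-floor penalty, there is a standard textbook approximation (not from the article itself) for the best-case signal-to-noise ratio of a full-scale sine wave sampled with a jittery clock: SNR = -20 x log10(2 x pi x f x tj) dB, where f is the signal frequency and tj the RMS jitter. A small Python sketch with illustrative values:

# Jitter-limited SNR for a full-scale sine wave; standard approximation,
# illustrative numbers only -- not measurements of any particular converter.
import math

def jitter_limited_snr_db(signal_freq_hz, jitter_rms_s):
    """Best-case SNR (dB) imposed by sample-clock jitter alone."""
    return -20 * math.log10(2 * math.pi * signal_freq_hz * jitter_rms_s)

for f in (1_000, 10_000, 20_000):          # test-tone frequencies in Hz
    for tj in (1e-9, 100e-12):             # 1 ns and 100 ps RMS jitter
        print(f"{f/1000:4.0f} kHz tone, {tj*1e12:5.0f} ps jitter: "
              f"{jitter_limited_snr_db(f, tj):5.1f} dB")

Note how the damage grows with signal frequency: 1 ns of RMS jitter still permits around 104 dB SNR on a 1 kHz tone, but only around 78 dB at 20 kHz, and this only applies at the conversion stages, not to digital-to-digital transfers.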

A good-quality master clock should have less intrinsic jitter than most individual devices, but that isn't a guarantee that you'll have a jitter-free system. There are three main causes of jitter: poor clock design, poor clock-recovery circuits (the part of the A-D/D-A converter which extracts the clock data from an incoming digital audio or reference signal), and the effects of the interconnecting cables. Of these, cable effects and poor clock-recovery circuits cause the most problems. The capacitance inherent in cables limits the slew rate of the data — how fast the signal can transition from one binary state to the other. At the output of a piece of digital equipment the data might switch from one state to the other in a beautifully crisp square wave, but by the time it reaches the input of another device the cable capacitance will have rounded it out into something looking more like a triangle wave. The clocking reference timing is generally taken from the points where the data transitions cross the nominal centre line of the waveform, and if these 'vertical transitions' have become sloping lines because of the cable capacitance, the precise point of transition becomes rather vague — that's jitter!

The greater the capacitance of the cable, the worse this problem becomes, so short, high-quality, low-capacitance cables will preserve clocking information far better than overly long, cheap, high-capacitance ones. Obviously, fibre optic cables don't suffer from electrical capacitance, but they have an optical equivalent, which is dispersion. If the optical quality of the plastic or glass is not optimised, the pulses of light can be degraded in such a way that the transitions between light and dark become (quite literally) blurred, and that causes exactly the same kind of jitter problems.
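
As a back-of-an-envelope illustration of the electrical case (my own simplified model, not figures from the article): treat the cable capacitance driven from a 75 ohm source as a simple RC low-pass. The received edge then becomes an exponential, and the shallower its slope at the mid-level crossing, the further any noise on the line shifts the apparent crossing time:

# Toy RC model of a digital edge arriving through a capacitive cable.
# All figures (75-ohm source, ~100 pF/m coax, 50 mV of noise on a 1 V swing)
# are assumptions for illustration, not from the article or any datasheet.
import math

R_SOURCE = 75.0         # ohms, typical word-clock source impedance
CAP_PER_M = 100e-12     # farads per metre of cable
NOISE_V = 0.05          # volts of noise riding on the received edge
SWING_V = 1.0           # volts, data swing at the receiver

for length_m in (1, 5, 20, 100):
    tau = R_SOURCE * CAP_PER_M * length_m       # RC time constant
    # v(t) = V * (1 - exp(-t/tau)) crosses 50% at t = tau * ln(2),
    # where the slope is V / (2 * tau) volts per second.
    slope = SWING_V / (2 * tau)
    jitter_s = NOISE_V / slope                  # timing shift from the noise
    print(f"{length_m:4d} m: tau = {tau*1e9:7.2f} ns, "
          f"crossing smeared by ~{jitter_s*1e9:8.3f} ns")

The exact numbers are unimportant; the point is the linear scaling — doubling the cable capacitance doubles the timing uncertainty for the same amount of noise.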

Fortunately, a good clock-recovery circuit can reject the effects of cable jitter, and some companies put a lot of effort into designing good jitter-rejecting clock recovery circuits. The problem is that most techniques which reject jitter to a high degree are very slow to respond and synchronise in the first place, so a practical compromise has to be reached, trading jitter rejection for responsiveness.
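
The nature of that compromise can be sketched with the usual first-order PLL model of a clock-recovery circuit (an illustration of the principle, not a description of any specific product): the loop acts as a low-pass filter on incoming jitter, so components above the loop bandwidth are attenuated, but a narrower loop also takes proportionally longer to lock:

# First-order PLL view of clock recovery: jitter above the loop bandwidth is
# filtered out, but narrower bandwidths mean slower locking. Illustrative only.
import math

def jitter_attenuation_db(jitter_freq_hz, loop_bw_hz):
    """Attenuation (dB) of a jitter component by a first-order loop."""
    return 20 * math.log10(math.sqrt(1 + (jitter_freq_hz / loop_bw_hz) ** 2))

JITTER_FREQ = 10_000.0                     # Hz, example jitter component

for loop_bw in (10_000, 1_000, 100, 10):   # Hz, candidate loop bandwidths
    atten = jitter_attenuation_db(JITTER_FREQ, loop_bw)
    settle_s = 5 / (2 * math.pi * loop_bw) # roughly five time constants
    print(f"loop BW {loop_bw:6d} Hz: 10 kHz jitter cut by {atten:5.1f} dB, "
          f"locks in ~{settle_s*1e3:8.3f} ms")

Narrowing the loop from 10 kHz to 10 Hz buys roughly 60 dB of rejection of that 10 kHz jitter component, but stretches the lock time from microseconds to tens of milliseconds — exactly the trade-off described above.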
So, given that cables induce clock jitter, and that some jitter often seeps through the clock-recovery circuitry, it won't come as a surprise to learn that it is often better to use the A-D converter's own internal crystal clock as the reference, both for the conversion itself and the rest of the digital system, rather than use an external reference. This assumes that the converter has a good-quality low-jitter clock, of course. If it doesn't, you might get better results clocking from a better-quality external clock, although you are then at the mercy of the jitter-rejection capability of the device's clock-recovery circuit.
Sometimes there is no choice but to clock an A-D converter externally: when you need to synchronise several separate A-D converters for a multi-channel recording, for example. Using good-quality converters linked by short clock cables to a common master reference clock would be the best and most practical solution in this case. The only alternative would be to run each converter on its internal clock, and then use sample-rate converters to resynchronise their outputs to a common reference — an expensive option, and one which might introduce a whole different set of unwanted artefacts!


Published May 2006

Friday, March 27, 2020

Q. Which converter should I set as the clock master?

By Hugh Robjohns
If you use an external A-D converter such as the Audient ASP880 (pictured) then it usually, though not always, makes sense to set it rather than your audio interface to be the clock master.
I love to read Hugh Robjohns’ stuff, even if the deep technical knowledge sometimes (often?) overwhelms me! I just read his article about the need for master clocks. My own setup is an Audient ASP880 eight-channel preamp connected via ADAT to an Avid HD Omni working with Pro Tools 12.4 HD Native. I was wondering if it was better to set the Audient as master for recording (and the Omni as slave) since the Audient does the A-D conversion, or does it make no difference? And, should I set the Omni as master again when I’m mixing? Thanks in advance.

Guido Jakobi

SOS Technical Editor Hugh Robjohns replies: I’m glad you enjoy reading our technical features, and thanks for the kind words! As you’ve gathered, the most critical element in any digital system is the A-D conversion, since that’s usually where the sound quality is really ‘set in stone’. If you don’t like the sound of a particular D-A converter you can always change it later (unless you’re sending a signal out through the D-A to, say, an analogue compressor and recording the result back through an A-D), but once a signal has been digitised by an A-D, you’re stuck with the data created at that point!

One of the most important factors in A-D quality is the stability of the sample clock, and I think it has been well established now that most converters deliver their best technical performance when running off their internal clock, rather than being slaved from an external one. So, if the Audient is handling the most important analogue inputs to your system, then I would agree with your thinking that running the ASP880 as the system master clock (and setting the Omni HD to operate as a clock slave from its ADAT input) would be the best technical approach, and the most likely configuration to give the highest possible conversion quality when recording.

As it happens, this arrangement would also be the easiest and most convenient way to wire up your specific system, too. And it’s also the most sensible arrangement if using the Audient’s A-D converter and the Omni HD’s D-A converter simultaneously to hook up external analogue processors, as the A-D stage is usually slightly more sensitive to clocking than the D-A one.

But when it comes to mixing, and assuming that you are using the D-As in the Omni HD for your monitoring feeds, then it would make sense from a technical perspective to switch the Omni HD to be the master clock. To use the Omni HD as the clock master, you’ll need to run a BNC-BNC cable from the Omni’s word-clock output to the ASP880’s word-clock input (with the 75Ω termination switched on). Set the Omni HD to run on its internal clock, and the ASP880 to accept the external clock input.
Of course, there’s nothing to stop you experimenting with alternative clocking arrangements to see if you prefer the sound of one configuration over another — but be aware that any sonic differences are likely to be extremely subtle!

In practice, I suspect you probably won’t notice any sound-quality benefit from resetting the Omni HD to be the clock master when mixing, and I’d be tempted to leave it in slave mode throughout, just for the convenience factor. Rest assured that either way, the clock master selection will have absolutely no effect on the quality of your digital mix files at all — it will only affect the performance of the D-A and, therefore, the local analogue monitoring.



Published March 2017

Wednesday, March 25, 2020

Q. How can I find out if my melody has been used before?

By Mike Senior
I have a melody which I am absolutely sure I made up when I was a kid and I want to use it in a composition I am working on. However, the things kids make up can be 'derivative' at the best of times, and I need to confirm that I hadn't simply heard it somewhere before. I am fairly sure that I did write it myself, because I have never come across it in the 20-odd years since, but it would be nice to be sure.
The Melodyhound website.
Do you know of any melody databases that could help me find out if I have lifted the tune from somewhere? I found one on the web (www.melodyhound.com) where you enter the Parsons Code of your tune and it checks a database, but I have no way of knowing how exhaustive their database is. Can you offer any advice?

SOS Forum Post

Reviews Editor Mike Senior replies: There are a number of melody databases accessible on-line, using a variety of different methods to catalogue and recognise melodies, though none of them is by any means exhaustive. You're probably no worse off humming your melody to friends and family of different ages to see if it rings any bells. However, if you are planning to release a track containing this melody, you may need to take the issue a bit more seriously.
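
For reference, the Parsons Code the question mentions reduces a melody to its contour alone: an initial '*' for the first note, then 'u', 'd' or 'r' for each subsequent note that moves up, moves down or repeats, which makes a search robust to key, tempo and rhythm. A minimal Python sketch (the example pitches are mine, not the questioner's melody):

# Encode a melody as Parsons Code: '*' for the first note, then 'u'/'d'/'r'
# according to whether each note is higher than, lower than or equal to the
# previous one. Works on MIDI note numbers or any comparable pitch values.
def parsons_code(pitches):
    if not pitches:
        return ""
    code = ["*"]
    for prev, curr in zip(pitches, pitches[1:]):
        code.append("u" if curr > prev else "d" if curr < prev else "r")
    return "".join(code)

# Opening of 'Twinkle Twinkle Little Star': C C G G A A G
print(parsons_code([60, 60, 67, 67, 69, 69, 67]))   # -> *rururd

The resulting string is what you type into a database such as Melodyhound; the crudeness of the contour representation is exactly what makes approximate matching across different keys and arrangements possible.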

Disregarding the significant moral issues here, the legal situation in the case of similar melodies, as I understand it, is very much a grey area, and far less straightforward than in cases of mechanical copyright infringement (using a sample without permission). If the context and lyrics have completely changed, then the party claiming infringement would be obliged to show not only a serious similarity in both note pattern and rhythm, but probably also that you had sufficient access to the material from which the melody was allegedly taken. An interesting case study is the 1984 copyright infringement case against the Bee Gees, where unknown songwriter Ronald Selle accused the brothers Gibb of lifting the melody from his song 'Let It End' for their hit 'How Deep Is Your Love'. You can find information on the case by searching the web for 'Selle v. Gibb'. Alternatively, Columbia Law School's Music Plagiarism Project web site, which can be freely accessed at www.ccnmtl.columbia.edu/projects/law/library/entrance.html, features a wealth of information on music copyright infringement cases in the US over the last 150 years, in the form of documents, scores, audio and video files.

In the Bee Gees case, though the melodies in the two songs were agreed to be all but identical (even Maurice Gibb mistook part of Selle's song for the Bee Gees' own while on the stand) and Selle had written and copyrighted 'Let It End' several years before the Bee Gees wrote 'How Deep Is Your Love', the claim of copyright infringement was rejected by the judge. Why? Because Selle couldn't prove that the Bee Gees could have heard his song before they wrote theirs, and couldn't rule out a common source which inspired both songs. 'Let It End' was rejected by a number of record companies and never released, and Selle was forced to admit that there were similarities between both songs and a number of others, including a previous Bee Gees hit. It would seem that in this kind of case, the onus is on the party making the accusation of plagiarism to show not only that the similarities in melody are such that they can only be explained by copying, but also that the alleged copyists could definitely have had access to the original melody and that they couldn't both have copied it from somewhere else.

Although I'm no legal professional, I have been told that this last point can give rise to a defence against accusations of copyright infringement, namely to cite a piece of music that's in the public domain, such as an old classical work or a piece of folk music, as the source of your contested melody. Quite a few composers of high-profile music have sought out this kind of copyright-free music in the past, as an insurance policy in case anyone should have a pop at them. Accusations of plagiarism are more often than not groundless (at least, I'd like to think so), and the more successful a composition is, the more likely it is that someone will believe that the money and fame should belong to them. Having a piece of copyright-free music to claim as a source could quash any suit before it gets off the ground. Again, there are databases which can be used to look for suitable material, but they're rare, notation-heavy, and all of them index the melodies differently. Finding instances of a given melody is a long-winded and painstaking job (I speak from first-hand experience on this one!), even if you have access to the books and you have a musicology degree.

If you're worried that your melody is a knock-off, go ahead and record it anyway. When you listen to the finished article, with your lyrics, your arrangement and your performance, you may well find that it is your song after all.



Published January 2004