jmvalin: (opus)

Ever since we started working on Opus at the IETF, it's been a recurring theme. "You guys don't know how to test codecs", "You can't be serious unless you spend $100,000 testing your codec with several independent labs", or even "designing codecs is easy, it's testing that's hard". OK, subjective testing is indeed important. After all, that's the main thing that differentiates serious signal processing work from idiots using $1000 directional, oxygen-free speaker cable. However, just like speaker cables, more expensive listening tests do not necessarily mean more useful results. In this post I'm going to explain why this kind of thinking is wrong. I will avoid naming anyone here because I want to attack the myth of the $100,000 listening test, not the people who believe in it.

In the Beginning

Back in the 70s and 80s, digital audio equipment was very expensive, complicated to deploy, and difficult to test at all. Not everyone could afford analog-to-digital converters (ADC) or digital-to-analog converters (DAC), so any testing required using expensive, specialized labs. When someone came up with a new piece of equipment or a codec, it could end up being deployed for several decades, so it made sense to give it to one of these labs to test the hell out of it. At the same time, it wasn't too hard to do a good job in testing because algorithms were generally simple and codecs only supported one or two modes of operation. For example, a codec like G.711 only has a single bit-rate and can be implemented in less than 10 lines of code. With something that simple, it's generally not too hard to have 100% code coverage and make sure all corner cases are handled correctly. Considering the investments involved, it just made sense to pay tens or hundreds of thousands of dollars to make sure nothing blows up. This was paid by large telcos and their suppliers, so they could afford it anyway.
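
To give an idea of just how simple that is, here's a rough sketch of mu-law companding, the idea behind G.711. This is only an illustration: it uses the continuous mu-law formula with mu = 255 rather than the segmented, bit-exact approximation the actual standard specifies.

    #include <math.h>
    #include <stdint.h>

    /* Continuous mu-law companding (mu = 255), a simplified stand-in for the
       segmented approximation that G.711 actually specifies. Input in [-1, 1]. */
    static uint8_t mulaw_encode(float x)
    {
        float mag = fabsf(x);
        float y = logf(1.f + 255.f*mag) / logf(256.f);  /* compress to [0, 1] */
        int q = (int)floorf(y*127.f + 0.5f);            /* 7-bit magnitude */
        return (uint8_t)((x < 0 ? 0x80 : 0x00) | q);    /* plus a sign bit */
    }

    static float mulaw_decode(uint8_t c)
    {
        float y = (c & 0x7F) / 127.f;
        float mag = (powf(256.f, y) - 1.f) / 255.f;     /* expand back */
        return (c & 0x80) ? -mag : mag;
    }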

Things remained pretty much the same through the 90s. When G.729 was standardized in 1995, it still only had a single bit-rate, and the computational complexity was still beyond what a PC could do in real-time. A few years later, we finally got codecs like AMR-NB that supported several bit-rates, though the number was still small enough that you could test each of them.

Enter Opus

When we first attempted to create a codec working group (WG) at the IETF, some folks were less than thrilled to have their "codec monopoly" challenged. The first objection we heard was "you're not competent enough to write a codec". After pointing out that we already had three candidate codecs on the table (SILK, CELT, BroadVoice), created by the authors of 3 already-deployed codecs (iSAC, Speex, G.728), the objection quickly switched to testing. After all, how was the IETF going to review this work and make sure it was any good?

The best answer came from an old-time ("gray beard") IETF participant and was along the lines of: "we at the IETF are used to reviewing things that are a lot harder to evaluate, like crypto standards. When it comes to audio, at least all of us have two ears". And it makes sense. Among all the things the IETF does (transport protocols, security, signalling, ...), codecs are among the easiest to test because at least you know the criteria and they're directly measurable. Audio quality is a hell of a lot easier to measure than "is this cipher breakable?", "is this signalling extensible enough?", or "Will this BGP update break the Internet?"

Of course, that was not the end of the testing story. For many months in 2011 we were again faced with never-ending complaints that Opus "had not been tested". There was this implicit assumption that testing the final codec improves the codec. Yeah right! Apparently, the Big-Test-At-The-End is meant to ensure that the codec is good, and if it's not, then you have to go back to the drawing board. Interestingly, I'm not aware of a single ITU-T codec for which that happened. On the other hand, I am aware of at least one case where the Big-Test-At-The-End revealed something wrong. Let's look at the listening test results from the AMR-WB (a.k.a. G.722.2) codec. AMR-WB has 9 bitrates, ranging from 6.6 kb/s to 23.85 kb/s. The interesting thing about the results is that the 23.85 kb/s mode actually has lower quality than the 23.05 kb/s mode below it. That's a sign that something's gone wrong somewhere. I'm not aware of why that was the case or what exactly happened from there, but apparently it didn't bother people enough to actually fix the problem. That's the problem with final tests: they're final.

A Better Approach

What I've learned from Opus is that it's possible to have tests that are far more useful and much cheaper. First, final tests aren't that useful. Although we did conduct some of those, ultimately their main use ends up being for marketing and bragging rights. After all, if you still need these tests to convince yourself that your codec is any good, something's very wrong with your development process. Besides, when you look at a codec like Opus, you have about 1200 possible bitrates, using three different coding modes, four different frame sizes, and either mono or stereo input. That's far more than one can reliably test with traditional subjective listening tests. Even if you could, modern codecs are complex enough that some problems may only occur with very specific audio signals.

The single testing approach that gave us the most useful results was also the simplest: just put the code out there so people can use it. That's how we got reports like "it works well overall, but not on this rare piece of post-neo-modern folk metal" or "it worked for all our instruments except my bass". This is not something you can catch with ITU-style testing. It's one of the most fundamental principles of open-source development: "given enough eyeballs, all bugs are shallow". Another approach was simply to throw tons of audio at it and evaluate the quality using PEAQ-style objective measurement tools. While these tools are generally unreliable for precise evaluation of a codec's quality, they're pretty good at flagging files the codec does badly on for further analysis.
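
To illustrate what that kind of screening looks like, here's a hedged sketch. It uses segmental SNR as a crude stand-in for a real perceptual metric like PEAQ (which it is not), and the 15 dB threshold is a made-up placeholder; the point is only the workflow of scoring every file automatically and flagging the worst ones for actual listening.

    #include <math.h>
    #include <stdio.h>

    /* Segmental SNR as a crude stand-in for a real objective metric.
       ref and dec are time-aligned PCM signals of length n at 48 kHz. */
    static double segmental_snr(const float *ref, const float *dec, int n)
    {
        const int frame = 960;  /* 20 ms at 48 kHz */
        double sum = 0;
        int frames = 0;
        for (int i = 0; i + frame <= n; i += frame) {
            double sig = 1e-9, err = 1e-9;
            for (int j = 0; j < frame; j++) {
                double e = ref[i+j] - dec[i+j];
                sig += ref[i+j]*ref[i+j];
                err += e*e;
            }
            sum += 10*log10(sig/err);
            frames++;
        }
        return frames ? sum/frames : 0;
    }

    /* Flag any file whose score falls below a (placeholder) threshold. */
    static void screen_file(const char *name, const float *ref,
                            const float *dec, int n)
    {
        double score = segmental_snr(ref, dec, n);
        if (score < 15.0)
            printf("FLAG %s: segSNR %.1f dB, needs manual listening\n", name, score);
    }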

We ended up using more than a dozen different approaches to testing, including various flavours of fuzzing. In the end, when it comes to the final testing, nothing beats having the thing out there. After all, as our Skype friends would put it:

Which codec do you trust more? The codec that's been tested by dozens of listeners in a highly controlled lab, or the codec that's been tested by hundreds of millions of listeners in just about all conditions imaginable?
It's not like we actually invented anything here either. Software testing has evolved quite a bit since the 80s and we've mainly attempted to follow the best practices rather than use antiquated methods "because that's what we've always done".

jmvalin: (opus)
For those who had been wondering what we thought of the recent France Telecom IPR declaration against Opus, here's our response. It's nice to be working for a company that isn't afraid of speaking publicly about patents.
jmvalin: (opus)

We just released Opus 1.1-alpha, which includes more than one year of development compared to the 1.0.x branch. There are quality improvements, optimizations, bug fixes, as well as an experimental speech/music detector for mode decisions. That being said, it's still an alpha release, which means it can also do stupid things sometimes. If you come across any of those, please let us know so we can fix it. You can send an email to the mailing list, or join us on IRC in #opus on irc.freenode.net. The main reason for releasing this alpha is to get feedback about what works and what does not.

Quality improvements

Most of the quality improvements come from the unconstrained variable bitrate (VBR). The 1.0.x encoder's VBR always attempts to meet its target bitrate. The new VBR code is free to deviate from its target depending on how difficult the file is to encode. In addition to boosting the rate on transients as 1.0.x does, the new encoder also boosts the rate on tonal signals, which are harder for Opus to code. On the other hand, for signals with a narrow stereo image, Opus can reduce the bitrate. What this means in the end is that some files may deviate significantly from the target. For example, someone encoding their music collection at 64 kb/s (nominal) may find that some files end up as low as 48 kb/s, while others may use up to about 96 kb/s. However, for a large enough collection, the average should be fairly close to the target.

There are a few more ways in which the alpha improves quality. The dynamic allocation code was improved and made more aggressive, the transient detector was once again rewritten, and so was the time/frequency (tf) analysis code. A simple thing that improves the quality of some files is the new DC rejection (3 Hz high-pass) filter. DC isn't supposed to be present in audio signals, but it sometimes is, and it harms quality. Finally, there are many minor improvements for speech quality (both on the SILK side and on the CELT side), including changes to the pitch estimator.
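
Since a DC blocker really is that simple, here's roughly what a first-order DC rejection filter looks like. This is an illustrative sketch, not the exact filter in the Opus encoder; the coefficient is just 2*pi*3/48000 for an approximately 3 Hz corner at 48 kHz.

    /* First-order DC rejection (high-pass) filter. The coefficient sets the
       cutoff: roughly 2*pi*3/48000 ~= 0.0004 for a ~3 Hz corner at 48 kHz.
       Illustrative only, not the implementation used in the Opus encoder. */
    typedef struct { float mem; } dc_reject_state;

    static void dc_reject(dc_reject_state *st, const float *in, float *out, int n)
    {
        const float coef = 0.0004f;
        for (int i = 0; i < n; i++) {
            out[i] = in[i] - st->mem;   /* subtract the running DC estimate */
            st->mem += coef * out[i];   /* slowly track the DC component */
        }
    }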

Speech/music detector

Another big feature is automatic detection of speech and music. This is useful for selecting the optimal encoding mode between SILK-only/hybrid and CELT-only. Contrary to what some people think, it's not as simple as encoding all music with CELT and all speech with SILK. It also depends on the bitrate (at very low rates, we'll use SILK for music, and at high rates, we'll use CELT for speech). Automatic detection isn't easy, but doing it in real-time (with no look-ahead) is even harder. Because of that, the detector tends to take 1-2 seconds before reacting to transitions and will sometimes make bad decisions. We'd be interested in knowing about any screw-ups of the algorithm.
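
To make the bitrate dependence concrete, here's the general shape of the decision logic. This is only a sketch: choose_mode() and its thresholds are made up for illustration, and the real encoder uses a proper classifier and smarter rules.

    typedef enum { MODE_SILK_ONLY, MODE_HYBRID, MODE_CELT_ONLY } opus_mode;

    /* music_prob in [0,1] would come from a real-time speech/music classifier;
       bitrate is in bits per second. All thresholds here are illustrative. */
    static opus_mode choose_mode(float music_prob, int bitrate)
    {
        if (bitrate < 16000)
            return MODE_SILK_ONLY;  /* at very low rates SILK wins even for music */
        if (bitrate > 40000)
            return MODE_CELT_ONLY;  /* at high rates CELT wins even for speech */
        /* In between, let the classifier pick between hybrid and CELT. */
        return music_prob > 0.5f ? MODE_CELT_ONLY : MODE_HYBRID;
    }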

Bandwidth detection

The new encoder can also detect the bandwidth of the input signal. This is useful to avoid wasting bits encoding frequencies that aren't present in the signal. While easier than speech/music detection, bandwidth detection isn't as easy as it sounds because of aliasing, quantization, and dithering. The current algorithm should do a reasonable job, but again we'd be interested in knowing about any failures.
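
Here's a rough sketch of the idea: measure how much energy lies above each candidate bandwidth and only pick a narrower bandwidth when that energy is negligible compared to the total. The band edges follow the Opus bandwidth set, but the margin and the absence of any hysteresis over time are simplifications; the real detector has to be more careful precisely because of aliasing, quantization and dithering.

    /* spectrum holds magnitude-squared FFT bins for one frame, nbins bins
       covering 0..24 kHz. Returns an audio bandwidth in Hz from the Opus set
       (NB, MB, WB, SWB, FB). The -40 dB margin is a placeholder. */
    static int detect_bandwidth(const float *spectrum, int nbins)
    {
        const int candidates[4] = { 4000, 6000, 8000, 12000 };
        double total = 1e-9;
        for (int i = 0; i < nbins; i++) total += spectrum[i];

        for (int c = 0; c < 4; c++) {
            int cutoff_bin = (int)((double)candidates[c] / 24000.0 * nbins);
            double above = 0;
            for (int i = cutoff_bin; i < nbins; i++) above += spectrum[i];
            /* Keep the narrower bandwidth only if the energy above it is negligible. */
            if (above < 1e-4 * total)
                return candidates[c];
        }
        return 20000;  /* fullband */
    }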

jmvalin: (opus)
We finally made it! Opus is now standardized by the IETF as RFC 6716. See the Mozilla hacks post and the Xiph.Org press release for more details. Of course, feel free to help spread the word around.

We're also releasing both version 1.0.0, which is the same code as the RFC, and version 1.0.1, which is a minor update to that code (mainly in the build system). As usual, you can get those from http://opus-codec.org/

Thanks to everyone who contributed by fixing bugs, reporting issues, implementing Opus support, testing, advocating, ... It was a lot of work, but it was worth it.
jmvalin: (Default)

I just got back from the 84th IETF meeting in Vancouver. The most interesting part (as far as I was concerned anyway) was the rtcweb working group meeting. One of the topics was selecting the mandatory-to-implement (MTI) codecs. For audio, we proposed having both Opus and G.711 as MTI codecs. Much to our surprise, most of the following discussion was over whether G.711 was a good idea. In the end, there was strong consensus (the IETF believes in "rough consensus and running code") in favor of Opus+G.711, so that's what's going to be in rtcweb. Of course, implementers will probably ship with a bunch of other codecs for legacy compatibility purposes.

The video codec discussion was far less successful. Not only is there still no consensus over which codec to use (VP8 vs H.264), but there has also been no significant progress toward a consensus. Personally, I can't see how anyone could possibly consider H.264 a viable option. Not only is it incompatible with open source, but it's like signing a blank check: nobody knows how much MPEG-LA will decide to charge for it in the coming years, especially for the encoder, which is currently not an issue for HTML5 (which only requires a decoder). The main argument I have heard against VP8 is "we don't know if there are patents". While this is true in some sense, the problem is much worse for H.264: not only are there tons of known patents for which we only know the licensing fees in the short term, but there's still at least as much risk when it comes to unlicensed patents (see the current Motorola v. Microsoft case).

jmvalin: (opus)
Three years after we first tried convincing the IETF to standardize an audio codec, Opus has finally been approved by the IETF. The only remaining step until it's officially an RFC is the RFC editor (fixing the last minor issues, typos, ...). That should take on the order of 6-8 weeks (variable), at which point we'll have the RFC and the 1.0 release. Thanks to everyone who helped with developing, testing, supporting, or advocating for Opus.
jmvalin: (Default)

As of today, it's really important that I don't forget to tell people not to do illegal stuff. That's because today, a new special law states (among other things) that (rough translation):

"Whoever causes, by an act, omission, help, encouragement, advice, consent, authorization, or order, someone to do something which is an offense under that law, then that person is deemed to have committed the same offence".

For those who can read French, here's the French text for that quote. What is still unclear is whether that clause applies recursively and if so, down how many levels. For example, if I don't tell you to tell Joe to tell Bob to tell George not to commit an offence under that law, have I committed an offence?

No, I do not live in China or North Korea, but in the Canadian province of Quebec. The reason this law is apparently really necessary is that the government had to stop university students from savagely attacking police batons with their heads. I mean, there are a few minor issues, like the fact that the Quebec Bar association considers this new law to be unconstitutional, but hey, who really cares about those sorts of details anymore?

jmvalin: (Default)
During LCA 2012, I got to meet face-to-face (for only the second time) with David Rowe and discuss Codec2. This led to a hacking session where we figured out how to save about 10 bits on LSP quantization by using vector quantization (VQ). This may not sound like a lot, but for a 2 kb/s codec, 10 bits every 20 ms is 500 b/s, so one quarter of the bit-rate. That new code is now in David's hands and he's been doing a good job of tweaking it to get optimal quality/bitrate. This led me to look at the rest of the bits, which are taken mostly by the pitch frequency (between 50 Hz and 400 Hz) and the excitation energy (between -10 dB and 40 dB). The pitch is currently coded linearly (constant spacing in Hz) with 7 bits, while the energy is coded linearly in dB using 5 bits. That's a total of 12 bits for pitch and energy. Now, how can we improve that?

The first assumption I make here is that David already checked that both pitch and energy are encoded at the "optimal" resolution that balances bitrate and coding artefacts. To reduce the rate, we need a smarter quantizer. Below is the distribution of the pitch and energy for my training database.

[Figure: joint distribution of pitch and energy over the training database]
So what if we were to use vector quantization to reduce the bit-rate? In theory, we could reduce the rate (for equal error) by having more codevectors in areas where the figure above shows more data. Same error, lower rate, but still a bad idea. It would be bad because it would mean that for some people, whose pitch falls in a less likely range, codec2 wouldn't work well. It would also mean that just changing the audio gain could make codec2 do worse. That is clearly not acceptable. We need to care not just about the mean square error (MSE), but also about the outliers. We need to be able to encode any amplitude in increments of 1-2 dB and any pitch in increments of around 0.04-0.08 (between half a semitone and a semitone). So it looks like we're stuck, and the best we could do is uniform VQ, which wouldn't save much compared to scalar quantization.

The key here is to relax our resolution constraint above. In practice, we only need such good resolution when the signal is stationary. For example, when the pitch in unvoiced frames jumps around randomly, it's not really important to encode it accurately. Similarly, energy errors are much more perceptible when the energy is stable than when it's fluctuating. So this is where prediction becomes very useful, because stationary signals are exactly the ones that are easily predicted. By using a simple first-order recursive predictor (prediction = alpha*previous_value), we can reduce the range for which we need good resolution by a factor (1-alpha). For example, if we have a signal that ranges from 0 to 100 and we want a resolution of 1, then using alpha=0.9, the prediction error (current_value-prediction) will have a range of 0 to 10 when the signal is stationary. We still need quantizer values outside that range to encode variations, but we don't need good resolution there.
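
In code, the predictor itself is trivial; here's a small sketch (illustrative, not the codec2 implementation). With alpha = 0.9, a stationary parameter only leaves a residual of about one tenth of its value, and that is the range that needs fine quantization.

    /* First-order recursive prediction of a parameter track (e.g. pitch or
       log-energy). For a stationary input x the residual is (1-alpha)*x, so
       with alpha = 0.9 only one tenth of the original range needs fine resolution. */
    static float predict_residual(float current, float *prev, float alpha)
    {
        float prediction = alpha * (*prev);
        float residual = current - prediction;  /* this is what gets quantized */
        *prev = current;  /* in a real quantizer, store the *quantized* value
                             so that encoder and decoder stay in sync */
        return residual;
    }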

Now that we have reduced the domain for which we need good resolution, we can actually start using vector quantization too. By combining prediction and vector quantization, it's possible to have a good enough quantizer using only 8 bits for both the energy and the pitch, saving 4 bits, so 200 b/s. The figure below illustrates how the quantizer is trained, with the distribution of the prediction residual (actual value minus prediction) in blue, and the distribution of the code vectors in red. The prediction coefficients are 0.8 for pitch and 0.9 for energy.

[Figure: distribution of the prediction residual (blue) and of the trained code vectors (red)]
The first thing we notice from the residual distribution is that it's much less uniform, with two higher-density areas that stand out. The first is around (0.3, 0), which corresponds to the case where the pitch and energy are stationary; it is about one fifth of the range for pitch (which has a prediction coefficient of 4/5) and one tenth of the range for energy (which has a prediction coefficient of 9/10). The second higher-density area is a line around a residual energy of -2.5, and it corresponds to silence. Now looking at the codebook in red, we can see a very high density of vectors in the area of stationary speech, enough for a resolution of 1-2 dB in energy and half a semitone to a semitone in pitch. The difference is that this time the high resolution is only needed over a much smaller range. The reason we see such a high density of code vectors around stationary speech and not so much around the "silence line" is the last detail of this quantizer: weighting. The whole codebook training procedure uses weighting based on how important the quantization error is. The weight given to pitch and energy errors on stationary voiced speech is much higher than for non-stationary speech or silence. This is why this quantizer is able to give good enough quality with 8 bits instead of 12.
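
For those wondering what the quantizer itself looks like, here's a sketch of a weighted nearest-neighbour search over a joint (pitch, energy) residual codebook. The 256 entries correspond to the 8 bits mentioned above; the function name and the way the weights are passed in are just for illustration, and the codebook itself would come from a weighted k-means style training on the residuals.

    #include <float.h>

    /* Weighted 2-D VQ search over a 256-entry codebook of (pitch, energy)
       residuals (8 bits total). w_pitch and w_energy are larger for stationary
       voiced speech, so errors there dominate both training and search. */
    static int quantize_pitch_energy(float pitch_res, float energy_res,
                                     const float codebook[256][2],
                                     float w_pitch, float w_energy)
    {
        int best = 0;
        float best_dist = FLT_MAX;
        for (int i = 0; i < 256; i++) {
            float dp = pitch_res  - codebook[i][0];
            float de = energy_res - codebook[i][1];
            float dist = w_pitch*dp*dp + w_energy*de*de;
            if (dist < best_dist) { best_dist = dist; best = i; }
        }
        return best;  /* 8-bit index sent to the decoder */
    }
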
jmvalin: (Default)

I just got back from linux.conf.au 2012 in Ballarat. The video for the talk I gave, Opus, the Swiss Army Knife of Audio Codecs, is now available on the Opus presentations page. For the Ogg-impaired, a lower-quality version is also available on YouTube.

For those who are into speech codecs, I also recommend watching David Rowe's presentation: Codec 2 - Open Source Speech Coding at 2400 bit/s and Below. His presentation was selected as one of the four best talks at LCA this year -- well worth watching.

jmvalin: (Default)

Those who have been following the Opus git repository in the past few weeks probably haven't noticed much work going on. The reason is pretty simple: most of the work has been going on elsewhere, in an experimental branch (named exp_wip3 for now) of my private repository. The reason it's in an experimental branch is that it's not fully converted to fixed-point and hasn't been tested on any frame size other than 20 ms. Here's an (incomplete) list of changes for now:

  • Really unconstrained VBR (not trying to keep the same average rate)
  • Tonality detection to give highly tonal audio a boost in bit-rate
  • (yet another) rewrite of the transient detection code
  • New dynamic allocation code that boosts the rate of bands that have significant spectral leakage caused by short blocks

Thanks to these changes, the quality has (as far as we can tell) gone up compared to the current master branch. I invite you to judge for yourself by comparing the audio coded with the current master branch with the audio coded with the new exp_wip3 experimental branch. This is 64 kb/s, so fairly low rate for stereo music. The original is here. Let me know what you think.

jmvalin: (Default)

(see the renovation series)

We decided to go with Armoires Créabec for our kitchen cabinets following a recommendation. Créabec markets itself as a maker of high-end cabinets. We like the overall look of our kitchen cabinets. The components are of good quality and the construction of the cabinets themselves is very good. However, it's a completely different story when it comes to service, delivery, and installation! We were advised by Christian. We had a very tight schedule to complete all our work before moving into our new house. We therefore bent over backwards to choose our kitchen cabinets quickly, with the salesman's promise that the kitchen would be ready on time. Once the contract was signed, the estimated date had already slipped by a week. Along the way, further delays appeared. In short, our kitchen was installed more than two weeks after the date originally promised. On top of that, we had considerable difficulty getting our phone calls returned. Furthermore, once everything was installed, we noticed that a large number of cabinet doors and boxes were scratched and damaged, that the joints of the ogees and light valances were misaligned, and that the walls were damaged (not just the paint; we have to redo the plaster in places). Christian then came to see the state of the kitchen for himself. He admitted: "This is the first time I've seen a kitchen as botched as this." Several cabinet doors were therefore refinished at the factory. It was then agreed that the installer would come back to rehang the doors, replace a damaged panel, repair other surface scratches, and redo the problematic ogee and valance joints. The installer, Rodrigue, came back with a single tool: a wax crayon! Afterwards, Christian assured us that he would fix what remained and that he would accompany his installer on the next visit.

Two weeks later, the tune had changed completely. Christian announced management's position: "Nothing more will be done on your kitchen." When I tried to understand the about-face, Christian responded with insults and personal attacks: "Your attitude isn't good"; "You don't give off good energy"; "You'll never be satisfied anyway"; "You're not an easy person to deal with." In short, we are extremely disappointed with how this story ended. The things that were said to us are completely unacceptable, especially considering that we spent a considerable amount of money on work that does not satisfy us and does not live up to the promises that were made. Let's just say we're a long way from "the customer is always right"!

Conclusion: Not fixing a mistake is often worse than the mistake itself, and that's exactly the position Créabec chose to take in our case. We will certainly not do business with Armoires Créabec again. What we take away from this experience is to take more time choosing not only the cabinets, but also the cabinet maker. Also, it's best to avoid paying the full amount "on delivery" as we did, and instead pay "after installation".

jmvalin: (Default)

(see the renovation series)

We hired Entreprise Jeannot Paquette Inc. to install a new gas heating system, put in a new bathroom in the basement, and do the plumbing for the new kitchen. We are very satisfied with the installation of the heating system. The work was done cleanly, on schedule, and without surprises. The only hiccup came from Gaz Métro (from a subcontractor, to be precise), who missed the connection deadline by a week (not convenient with a gas range).

On the plumbing side, the rough-in work (running the drains and pipes for the basement bathroom) went well and was done on schedule, often with very little notice from us, which was greatly appreciated. As for the finishing, there were a few hiccups with the kitchen sink installation (a leak in the drain) and the shower assembly. The sink drain problem was fixed quickly. As for the shower, we bought an all-glass OVÉ shower that seems difficult to assemble. Even after a second visit from the plumber to adjust the assembly (the glass panels don't meet at a right angle), there are still adjustments to be made. We're waiting to hear back from the plumber on this (to be continued).

All the plumbers who came to do work (three in all) were very friendly, meticulous, and punctual, which is always pleasant and reassuring. Also, a (small) part of the work was billed by the hour, and the time was counted completely honestly. We quickly came to trust Entreprise Jeannot Paquette.

Conclusion: Entreprise Jeannot Paquette offers excellent service. We greatly appreciated that the few plumbing installation problems were fixed quickly and without difficulty. We will gladly do business with Jeannot Paquette again if the opportunity arises.

jmvalin: (Default)

(see the renovation series)

We asked Toiture Alpine to replace our old tar roof with an elastomeric membrane. The representative was very courteous and we got an estimate quickly. The installation went well, although it ran two days behind schedule (including one day with no warning about the delay). The installer took the time to explain the pros and cons of certain options during the installation. I haven't been able to inspect the work yet (no ladder yet), but I can at least say that the new roof has already withstood a few storms.

Conclusion: We would probably do business with Toiture Alpine again if the opportunity arises (not too soon, I hope).

jmvalin: (Default)
I just got the news today that LCA 2012 has accepted my talk proposal: "Opus, the Swiss Army Knife of Audio Codecs". I'll be presenting it in Ballarat, Australia, in January. If there's any specific topic you'd like me to include in the talk, please let me know (by email or by commenting on this post).
jmvalin: (Default)

(see the renovation series)

We had our basement, a bedroom, and a deck done by Entreprises Spécialisées Enr. of Acton Vale. They did good-quality, very solid work. However, we strongly regret having paid them by the hour. For ten days of work, we were billed 294.5 hours, nearly 15 hours per day per worker. On top of that, travel time (nearly 6 hours per day) was billed at the full rate. We were never able to get a breakdown of the hours worked (very vague invoice, no clear answer over the phone), but our own estimate is well below that.

Conclusion: Never pay by the hour for large jobs; agree on a fixed price instead. Agree up front on how travel time will be billed. We are satisfied with the work done by the Entreprises Spécialisées Enr. crew, but because of billing we consider excessive, we will probably not do business with them again.

jmvalin: (Default)
As of yesterday, the IETF audio codec requirements are published as RFC 6366. While the requirements aren't by themselves interesting (why discuss abstract requirements when you can discuss actual running code?), it's an important milestone in that it's the first document published by the Working Group. It also means one less source of pointless arguments. The guidelines document is next in line and should go to IETF last call soon.

Now for the interesting part: the Opus codec itself. That's the only document that really matters. It should go to Working Group Last Call (WGLC) pretty soon (possibly within the next week or two). In the meantime, we're working on improving the clarity of the draft, cleaning up the code, and fixing the last few issues that have been reported since the first WGLC. Stay tuned.
jmvalin: (Default)

(see the renovation series)

Tapis Nadon is a family business offering a wide variety of floor coverings (carpet, wood, and ceramic). We are very satisfied with the service, the installation, and the carpet itself. The salesman, Mr. Nadon himself, took the time to advise us and show us all (and I do mean all!) of his carpets. On the installation side, the installer took the time to level (as much as possible) a floor that was very uneven. The installation was done meticulously, following the pattern of the carpet tiles we had chosen. The price was also reasonable (similar to other quotes).

Conclusion: There are still people who care about the details. We will certainly do business with Tapis Nadon again if the opportunity arises.

jmvalin: (Default)
(first entry in the renovation series)

Tremex is a family business that became a Solaris distributor a few years ago. We are very satisfied with the windows and the bay window (made by Solaris). The patio door is fine. There were a few hiccups with the patio door lock, but Solaris' after-sales service fixed the problem quickly. We were advised by Éric, who gave us very good service and was a great help with communications with the installer. The installer did a good job, but he was hard to reach (didn't return calls); it was much easier to communicate through Éric. Choosing the bay window options was difficult because neither Tremex nor the installer could show us sample models; that problem has since been fixed, as Tremex has expanded its showroom with the various bay window models available. Our bay window isn't quite what we had originally asked for, but it is well enough made that we're keeping it as is and are satisfied with it. The price was "in the middle" of the quotes we received. We consider the Solaris products to be very good value for money. The PVC windows block noise surprisingly well and the mechanism works very well.

Conclusion: The installation often matters as much as the product, and good after-sales service is important. If we had to do it again, we would probably do business with Tremex again.

jmvalin: (Default)

My partner and I had major work done on the house we recently bought. Now that the work is finished (today!), we decided to share our experience in the hope that it can be useful to others. Indeed, one of the main difficulties we had was finding recommendations for contractors to do the work. We chose not to go through a general contractor and to coordinate the work ourselves. We therefore learned a number of things to do and (especially) not to do when managing renovation work. So here is a series of entries on our renovation experience and the conclusions we drew from it.

Update: Index of the series:

  1. Tremex Portes et fenêtres
  2. Tapis Nadon
  3. Entreprises Spécialisées Enr.
  4. Toiture Alpine
  5. Entreprise Jeannot Paquette Inc.
  6. Armoires Créabec

jmvalin: (Default)
I spent my last week in Quebec City at the 81st IETF meeting. The most important meeting there for me was the codec WG. The good news is that there was a lot of progress in that meeting. A few issues with the Opus bit-stream (e.g. padding, frame packing) were resolved and the chairs are planning a second working group last call in four weeks. After that, if all goes well, the codec can go to IETF last call and then become an RFC.

My week at the IETF meeting was also my first week at my new job working for Mozilla. I've been hired specifically to work on Opus and other codec/multimedia development, so I should have a lot more time for that than I used to. First thing on my list: finishing the Ogg mapping for Opus and releasing an Ogg encoder and decoder.
