Thundering Herd

The more drums we have in our kit, the more jobs we can handle.

Updated: 2018-03-27T01:07:59.950+13:00


Firefox Media Playback Team Review Policy


Reviews form a central part of how we at Mozilla ensure engineering diligence. Prompt, yet thorough, reviews are also a critical component in maintaining team velocity and productivity. Reviews are also one of the primary ways that a distributed organization like Mozilla does its mentoring and development of team members.

So given how important reviews are, it pays to be deliberate about what you're aiming for.

The senior members of the Firefox Media Playback team met in Auckland in August 2016 to codify the roadmap, vision, and policy for the team, and one of the things we agreed upon was our review policy.

The policy has served us well, as I think we've demonstrated with all we've achieved, so I'm sharing it here in the hope that it inspires others.
  • Having fast reviews is a core value of the media team.
  • Reviews should be completed by the end of the next business day.
  • One patch for one logical scope or change. Don't cram everything into one patch!
  • Do not fix a problem, fix the cause. Workarounds are typically bad. Look at the big picture and find the cause.
  • Reviews should be clear: in all cases it should be obvious what the next course of action is.
  • Reviews are there to keep bad code out of the tree.
  • Bad code tends to bring out bad reviews.
  • Commit message should describe what the commit does and why. It should describe the old bad behaviour, and the new good behaviour, and why the change needs to be made.
  • R+ means I don’t want to see it again. Maybe with comments that must be addressed before landing.
  • R- means I do want to see it again, with a list of things to fix.
  • R canceled means we’re not going to review this.
  • Anyone on the media team should be expected to complete a follow-up bug.
  • It’s not OK for a reviewer to ask a test to be split out from a changeset, provided the test is related to the commit. By the time a patch gets to review, splitting the test out doesn’t create value, just stop-energy.
  • If the response to a review request is slow, ping or email the reviewer a reminder; failing that, find another reviewer.
  • Don’t be afraid to ask when the review will come. The reply to “when” can be “is it urgent?”
  • Everyone should feel comfortable pointing out flaws/bugs as a “drive by”.
  • Give people as much responsibility as they can handle.
  • Reviewers should make it clear what they haven’t reviewed.
  • Use American English spelling, for comments and code.
  • Enforce Mozilla coding style, and encourage auto formatters, like `./mach clang-format`.
  • Use ReviewBoard, except when you can't (e.g. security patches).

Not every bit of code you write needs to be optimal


It's easy to fall into the trap of obsessing about performance and trying to micro-optimize every little detail in the code you're writing, or reviewing for that matter. Most of the time, this just adds complexity and is a waste of effort.

If a piece of code only runs a few (or even a few hundred) times a second, a few nanoseconds per invocation won't make a significant difference. Chances are the performance wins you'll gain by micro-optimizing such code won't show up on a profile.

Given that, what should you do instead? Code is read and edited much more than it is written, so optimize for readability, and maintainability.

If you find yourself wondering whether a piece of code is making your program slow, one of the first things you should do is fire up a profiler, and measure it. Or add telemetry to report how long your function takes in the wild. Then you can stop guessing, and start doing science.

If data shows that your code is slow, by all means optimize it. But if not, you can get more impact out of your time by directing your efforts elsewhere.
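To make "measure, don't guess" concrete, here's a minimal sketch (the helper and example workload are mine, not Firefox code) of timing a suspect function before deciding whether it's worth optimizing:

```javascript
// Time a function over many iterations and report the average cost per
// call, in milliseconds. Wall-clock timing is crude compared to a real
// profiler, but it's still data rather than a guess.
function timeIt(fn, iterations) {
  const start = Date.now();
  for (let i = 0; i < iterations; i++) {
    fn();
  }
  const elapsedMs = Date.now() - start;
  return elapsedMs / iterations;
}

// Example: a tiny function like this, called a few hundred times a
// second, will almost never show up on a profile.
const avgMs = timeIt(() => Math.sqrt(12345.678), 100000);
console.log(`~${avgMs.toFixed(6)} ms per call`);
```

A real profiler (or telemetry from the wild) gives far better data than this, but even crude timing beats optimizing blind.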

How to install Ubuntu 17.04 on Dell XPS 15 9550


I had some trouble installing Ubuntu 17.04 to dual-boot with Windows 10 on my Dell XPS 15 9550, so documenting here in case it helps others...

Once I got Ubuntu installed, it runs well. I'm using the NVIDIA proprietary driver, and I've had no major hardware issues so far.

Most of the installation hurdles for me were caused by Ubuntu not being able to see the disk drive while it was operating in RAID mode, and by UEFI/Secure Boot seeming to block the install somehow.

The trick to getting past these hurdles was to set Windows to boot into Safe Mode, switch the disk drive to AHCI and disable Secure Boot in the BIOS, boot back into Windows (now in Safe Mode), and then switch Windows back to normal boot mode.

I found rcasero's notes on installing Ubuntu on Dell XPS 15 9560 useful.

Detailed steps to install...
  1. If your Windows partition is encrypted, print out a copy of your BitLocker key. You'll need to enter it on boot after changing anything in your BIOS.
  2. Boot into Windows 10.
  3. I also needed to resize my main Windows partition from inside Windows; the Ubuntu installer seemed unable to cope with resizing my encrypted Windows partition for some reason. You can resize your main Windows partition using Windows' "Create or edit Disk Partitions" tool.
  4. Configure Windows to boot into safe mode: Press Win+R and run msconfig.exe > Boot > Safe Mode. Reboot.
  5. Press the F12 key while the BIOS splash screen comes up. Just repeatedly pressing it while the machine is booting seems to be the most reliable tactic.
  6. In the BIOS menu, BIOS Setup > System Configuration > SATA Operation, change "RAID On" to "AHCI".
  7. In the BIOS menu, disable Secure Boot.
  8. Reboot into Windows. You'll need to enter your BitLocker key to unlock the drive since the BIOS changed. Windows will boot into Safe Mode. If you don't have your Windows install set to boot into Safe Mode, you'll get a BSOD.
  9. Once you've booted into Windows Safe Mode, you can configure Windows to boot in normal (non-Safe Mode) with msconfig.exe > Boot > Safe Mode again.
  10. Reboot with your Ubuntu USB Live Disk inserted, and press F12 while booting to select to boot from the Live USB disk.
  11. The rest of the install Just Worked.
  12. Once you've installed Ubuntu, for better reliability and performance, enable the proprietary GPU drivers, in System Settings > Software and Updates > Additional Drivers. I enabled the NVIDIA and Intel drivers.
  13. I've found the touchpad often registers clicks while I'm typing. Turning off System Settings > Mouse and Touchpad > "Tap to click" fixed this and gives pretty good touchpad behaviour.
  14. Firefox by default has its hardware accelerated layers disabled, but force-enabling it seems to work fine. Open "about:config" in Firefox, and toggle "layers.acceleration.force-enabled" to true. Restart Firefox.

Firefox video playback's skip-to-next-keyframe behavior


One of the quirks of Firefox's video playback stack is our skip-to-next-keyframe behavior. The purpose of this blog post is to document the tradeoffs skip-to-next-keyframe makes.

The fundamental question that skip-to-next-keyframe answers is: what do we do when the video stream decode can't keep up with the playback speed?

Video playback is a classic producer/consumer problem. You need to ensure that your audio and video stream decoders produce decoded samples at a rate no less than the rate at which the audio/video streams need to be rendered. You also don't want to produce decoded samples at a rate too much greater than the consumption rate, or else you'll waste memory.

For example, if we're running on a low-end PC, playing a 30 frames per second video, and the CPU is so slow that it can only decode an average of 10 frames per second, we're not going to be able to display all video frames.

This is also complicated by our video stack's legacy threading model. Our first video decoding implementation did the decoding of the video and audio streams on the same thread. We assumed that we were using software decoding, because we were supporting Ogg/Theora/Vorbis, and later WebM/VP8/Vorbis, which are only commonly available in software.

The pseudocode for our "decode thread" used to go something like this:

    while (!AudioDecodeFinished() || !VideoDecodeFinished()) {
      if (!HaveEnoughAudioDecoded()) {
        DecodeSomeAudio();
      }
      if (!HaveEnoughVideoDecoded()) {
        DecodeSomeVideo();
      }
      if (HaveLotsOfAudioDecoded() && HaveLotsOfVideoDecoded()) {
        SleepUntilRunningLowOnDecodedData();
      }
    }

This was an unfortunate design, but it certainly made some parts of our code much simpler and easier to write.

We've recently refactored our code, so it no longer looks like this, but for some of the older backends that we support (Ogg, WebM, and MP4 using GStreamer on Linux), the pseudocode is still effectively (but not explicitly or obviously) this. MP4 on Windows, Mac OS X, and Android in Firefox 36 and later now decodes asynchronously, so we are not limited to decoding on only one thread.

The consequence of decoding audio and video on the same thread only really bites on low-end hardware. I have an old Lenovo x131e netbook, which on some videos can take 400ms to decode a Theora keyframe. Since we use the same thread to decode audio as video, if we don't have at least 400ms of audio already decoded while we're decoding such a frame, we'll get an "audio underrun": we won't have enough audio decoded to keep up with playback, and so we end up glitching the audio stream. This is very jarring to the listener.

Humans are very sensitive to sound; glitching the audio stream is much more jarring to a human observer than dropping a few video frames. The tradeoff we made was to sacrifice the video stream playback in order not to glitch the audio stream playback. This is where skip-to-next-keyframe comes in.

With skip-to-next-keyframe, our pseudocode becomes:

    while (!AudioDecodeFinished() || !VideoDecodeFinished()) {
      if (!HaveEnoughAudioDecoded()) {
        DecodeSomeAudio();
      }
      if (!HaveEnoughVideoDecoded()) {
        bool skipToNextKeyframe =
          (AmountOfDecodedAudio() < LowAudioThreshold()) ||
          HaveRunOutOfDecodedVideoFrames();
        DecodeSomeVideo(skipToNextKeyframe);
      }
      if (HaveLotsOfAudioDecoded() && HaveLotsOfVideoDecoded()) {
        SleepUntilRunningLowOnDecodedData();
      }
    }

We also monitor how long a video frame decode takes, and if a decode takes longer than the low-audio threshold, we increase the low-audio threshold.

If we pass a true value for skipToNextKeyframe to the decoder, it is supposed to give up and skip its decode up to the next keyframe. That is, don't try to decode anything between now and the next keyframe[...]
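As a hedged illustration of the tradeoff above (the function and field names here are mine, not Gecko's actual code), the skip decision boils down to a small predicate over the decoder's buffer state:

```javascript
// Hypothetical sketch of the skip-to-next-keyframe decision. Skip if
// we're in danger of an audio underrun, or if playback has already run
// out of decoded video frames.
function shouldSkipToNextKeyframe(state) {
  return state.decodedAudioMs < state.lowAudioThresholdMs ||
         state.decodedVideoFrames === 0;
}

// If a video frame decode took longer than the low-audio threshold,
// grow the threshold so future decodes leave more audio headroom.
function adjustLowAudioThreshold(state, lastVideoDecodeMs) {
  if (lastVideoDecodeMs > state.lowAudioThresholdMs) {
    state.lowAudioThresholdMs = lastVideoDecodeMs;
  }
  return state.lowAudioThresholdMs;
}

console.log(shouldSkipToNextKeyframe({
  decodedAudioMs: 250, lowAudioThresholdMs: 400, decodedVideoFrames: 5,
})); // true: the audio buffer is below the threshold, so skip
```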

How to prefetch video/audio files for uninterrupted playback in HTML5 video/audio


Sometimes when you're playing a media file using an HTML5

Why does the HTML fullscreen API ask for approval after entering fullscreen, rather than before?


The HTML fullscreen API is a little different from other JS APIs that require permission, in that it doesn't ask permission before entering fullscreen, it asks forgiveness *after* entering fullscreen.

Firefox's fullscreen approval dialog, which asks "forgiveness" rather than permission.
The rationale for having our fullscreen API implementation ask forgiveness rather than request permission is to make it easier on script authors.

When the original API was designed, we had a number of HTML/JS APIs, like the geolocation API, that would ask permission. The user was prompted to approve, deny, or ignore the request, though they could bring the request back later via an icon in the URL bar and approve it then.

Geolocation approval dialog, from Dive Into HTML's geolocation example.
The problem with this design for script authors is that they can't tell if the user has ignored the approval request, or is just about to go back and approve it by bringing up the geolocation door-hanger again.

This model of requesting permission has been seen to cause problems for web apps in the wild using the geolocation API. If a user ignores the geolocation permission request, the web app often doesn't work right, and if the user approves the request some time later, the site often still doesn't start working correctly. The app just doesn't know whether it should throw up a warning, or whether it's about to be granted permission.

So the original developers of the fullscreen spec (Robert O'Callahan initially; later I and others were involved) opted to solve this problem by having our implementation ask forgiveness: once you've entered fullscreen, the user is asked to confirm the action.

This forces the user to approve or deny the request immediately, which means script knows right away whether fullscreen was engaged, and so whether it needs to take its fallback path.
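To make the contrast concrete, here's a toy model (not browser code; all names are mine) of why the ask-forgiveness design always gives script an immediate answer, while the ask-permission design can leave it hanging:

```javascript
// Ask-permission model: an ignored prompt leaves the request pending
// indefinitely, so the script can never decide on a fallback path.
function askPermissionModel(userResponse) {
  return new Promise((resolve, reject) => {
    if (userResponse === 'approve') resolve('fullscreen');
    else if (userResponse === 'deny') reject(new Error('denied'));
    // 'ignore': neither resolve nor reject -- the app is stuck guessing.
  });
}

// Ask-forgiveness model: fullscreen engages immediately, and the user
// confirms (or cancels) afterwards, so script always gets an answer.
function askForgivenessModel(userResponse) {
  return userResponse === 'deny'
    ? 'exited'      // user rejected after the fact; we drop back out
    : 'fullscreen'; // approved, or prompt still showing: we're fullscreen
}

console.log(askForgivenessModel('approve')); // "fullscreen"
```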

Note that the specification for requestFullscreen() defines that most of the requestFullscreen() algorithm should run asynchronously, so there is scope to change the fullscreen approval dialog into a permission request shown before entering fullscreen, if future maintainers or other implementors/browsers wish to do so.

What do the H.264/avc1 codecs parameters for video/mp4 MIME types mean?


The HTMLMediaElement.canPlayType() API enables you to query what video formats a user agent can play. For "video/mp4", the container for H.264/AAC, you can specify a "codecs" parameter that denotes the H.264 profile and level. Firefox doesn't currently handle the MP4 codecs parameter very well, so I took it upon myself to figure out what the codecs parameters mean for H.264.

According to RFC 6381, "The 'Codecs' and 'Profiles' Parameters for 'Bucket' Media Types", codecs parameters for H.264 are contained in the "avc1" sample entry, and are represented as follows:

    avc1.PPCCLL

That is, the string "avc1." (or "avc2."; I'm not sure what the difference is yet), followed by 3 bytes represented in hex without the "0x" prefix, where the bytes represent the following:

    PP = profile_idc
    CC = constraint_set flags
    LL = level_idc

These fields are defined in Annex A of the twinned ITU-T H.264 and ISO/IEC 14496-10:2012 standards. ITU-T H.264 can be downloaded for free.

profile_idc defines the H.264 profile. ITU-T H.264 doesn't have a single table listing what the different profile_idc values mean, but handily, Microsoft defines an eAVEncH264VProfile enumeration on the decimal values of profile_idc in Codecapi.h (available on Win7):

    enum eAVEncH264VProfile {
      eAVEncH264VProfile_unknown                   = 0,
      eAVEncH264VProfile_Simple                    = 66,
      eAVEncH264VProfile_Base                      = 66,
      eAVEncH264VProfile_Main                      = 77,
      eAVEncH264VProfile_High                      = 100,
      eAVEncH264VProfile_422                       = 122,
      eAVEncH264VProfile_High10                    = 110,
      eAVEncH264VProfile_444                       = 144,
      eAVEncH264VProfile_Extended                  = 88,
      eAVEncH264VProfile_ScalableBase              = 83,
      eAVEncH264VProfile_ScalableHigh              = 86,
      eAVEncH264VProfile_MultiviewHigh             = 118,
      eAVEncH264VProfile_StereoHigh                = 128,
      eAVEncH264VProfile_ConstrainedBase           = 256,
      eAVEncH264VProfile_UCConstrainedHigh         = 257,
      eAVEncH264VProfile_UCScalableConstrainedBase = 258,
      eAVEncH264VProfile_UCScalableConstrainedHigh = 259
    };

So for example, avc1.4D401E has a profile_idc of 0x4D, which is 77 in decimal, so it's Main profile.

The constraint_set flags are encoded as bit flags in 6 bitfields named constraint_set0_flag through constraint_set5_flag. The meaning of a constraint_setN_flag being set depends on the profile being represented. The bits a[...]
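As a small worked example (the helper name is mine; the byte layout is the avc1.PPCCLL format described above), the codecs string can be unpacked in a few lines of JavaScript:

```javascript
// Parse an H.264 "avc1.PPCCLL" (or "avc2.") codecs string into its
// three hex-encoded fields: profile_idc, constraint_set flags, and
// level_idc. Returns null if the string doesn't match the format.
function parseAvc1(codecs) {
  const m = /^avc[12]\.([0-9A-Fa-f]{2})([0-9A-Fa-f]{2})([0-9A-Fa-f]{2})$/
    .exec(codecs);
  if (!m) return null;
  return {
    profile_idc: parseInt(m[1], 16),
    constraint_flags: parseInt(m[2], 16),
    level_idc: parseInt(m[3], 16),
  };
}

const parsed = parseAvc1("avc1.4D401E");
console.log(parsed);
// { profile_idc: 77, constraint_flags: 64, level_idc: 30 }
// profile_idc 77 = Main profile (per eAVEncH264VProfile above)
```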

Mozilla at the New Zealand Programming Contest


On Saturday Mozilla sponsored the Auckland site of the New Zealand Programming Contest. We supplied t-shirts for the participants, competed in the competition, and we provided pizza for dinner afterwards.

Word must have got out that we were giving away swag, as the contest had double the usual number of participants, around 120 people, and we ran out of t-shirts!

I've been wanting to do this for a while. I learned a lot about coding during training for the programming contest while I was at university, so I think it's a great way to encourage the next generation to hone their skills.

We're also looking for interns to join us over the summer, so I took the opportunity to make a plug for our 2013 Mozilla Auckland Internship intake. I think it's a good way to get targeted exposure to the types of people we want to hire too.

We did well in the competition too, largely thanks to Edwin Flores, who represented Australasia at the ACM Programming Contest world finals a few years back. Go Team!

Hardware accelerated H.264 decoding landed in Firefox on Windows Vista and later


I have been hard at work getting our H.264 support on Windows Vista and later hardware accelerated using DXVA2. I'm happy to say that this finally landed in Firefox 23 Nightly builds. With this patch we'll use the GPU to accelerate H.264 video decoding when possible on Windows, which greatly reduces our CPU and power usage.

If you spot a bug in H.264 video in Firefox Nightly builds, please file a bug in Core: Audio/Video. Bonus points if you toggle the pref "" to false, reload the video, and tell me whether the bug still happens!

  1. H.264/AAC/MP3 support on Windows 7 and later is shipping in Release builds in Firefox 21 next week, Vista gets it in Firefox 22.
  2. Edwin Flores is working on our H.264/AAC/MP3 support on Mac and Linux.
  3. I'm currently working on getting MP3 support for Windows XP using DirectShow, it will probably land next cycle (Firefox 24).

Reducing Windows' background CPU load while building Firefox


If you're building Firefox on Windows 8 like I am, you might want to tweak the following settings to reduce the OS' background CPU load while you're building Firefox (some of these settings may also be applicable to Windows 7, but I haven't tested this):
  1. Add your src/object directory to Windows Defender's list of locations excluded from real time scans. I realized that the "Antimalware Service Executable" was using up to 40% CPU utilization during builds before I did this. You can add your src/objdir to the exclude list using the UI at: Windows Defender > Settings > Excluded files and locations.
  2. Remove your src/objdir from Windows' list of locations to be indexed. I actually did the inverse, and removed my home directory (which my src/objdir was inside) from the list of indexable locations and re-added the specific subdirectories in my home dir that I wanted indexed (Documents, etc) without re-adding my src/objdir.
Update, 11 July 2014: Recently the "Antimalware Service Executable" started hogging CPU again while building, so I added MSVC's cl.exe and link.exe to the list of "Excluded Processes" in Windows Defender > Settings, and that reduced the "Antimalware Service Executable"'s CPU usage while building.

    H.264/AAC/MP3 support now enabled by default in Firefox Nightlies on Windows 7 and later


    Support for playing H.264/AAC in MP4 and support for playing MP3 audio files in HTML5

    HTML5 video playbackRate and Ogg chaining support landed in Firefox


    Paul Adenot has recently landed patches in Firefox to enable the playbackRate attribute on HTML5

    Experimental H.264,AAC, and MP3 support in Firefox Nightly builds on Windows 7 and later


    As the Internet has already discovered, recently I landed patches to add a Windows Media Foundation playback backend for Firefox. This is pref'd off by default.

    This allows playback of H.264 video and AAC audio in MP4 and M4A files, and MP3 audio files in HTML5

    Enabling external monitor on Lenovo W530 with Nvidia Discrete Graphics and Ubuntu Linux 12.04


    I've recently acquired a Lenovo W530 laptop. It's nice. Running Ubuntu 12.04, and with 32GB of RAM and an SSD, it builds Firefox with cold/empty caches in 14 minutes flat, which is pretty good!

    It was a trial figuring out how to get the Nvidia graphics card and a second monitor working, however. The current Ubuntu stable nvidia-current package doesn't include support for the Quadro K1000M chipset, and the official Nvidia drivers didn't work for me, and left things missing when I uninstalled them.

    I eventually figured out the steps to get a compatible Nvidia driver installed and displaying a second monitor:

    1. Install the "X Updates" team's PPA, and install the latest Nvidia drivers, which support the Quadro K1000M. In the terminal enter:

        sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
        sudo apt-get update
        sudo apt-get install nvidia-current

    2. Reboot. At the Lenovo BIOS screen press "Enter" to interrupt normal startup, then F1 to enter the BIOS setup utility. Change Config > Display > Graphics Device to Discrete Graphics. This means the hardware will attempt to use only the Nvidia graphics card, not the Intel integrated graphics. Press F10 to save and boot the computer.

    3. (Updated 13 October 2012) Once you've booted up and logged in, you can use Ubuntu's "Displays" application to easily configure your secondary displays. I previously suggested running sudo nvidia-settings from the terminal and setting up your displays in the "X Server Display Configuration" tab (you may need to "Detect Displays" first, and you probably want the "TwinView" configuration), then using "Save to X Configuration File" (if the path shown is blank, enter /etc/X11/xorg.conf). Note, however, that if you use nvidia-settings to write an xorg.conf, X won't detect when you unplug a monitor, and so X may open windows on the monitor which is no longer connected, i.e. where you can't see them, which can be very inconvenient!

    4. (Updated 13 October 2012) To get your display brightness keys working (i.e. Fn+F8/F9) you need to tell the Nvidia graphics driver to enable the brightness controls. These days X automatically configures itself, so you can't just edit the xorg.conf file; instead you need to add a section to a file in /usr/share/X11/xorg.conf.d/ and X will include that section in the configuration that it automatically generates. So create a file in the xorg.conf.d directory, e.g.:

        sudo gedit /usr/share/X11/xorg.conf.d/10-nvidia-brightness.conf

    Paste the following into the file:

        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
            BoardName      "Quadro K1000M"
            Option         "RegistryDwords" "EnableBrightnessControl=1"
        EndSection

    Log out and log back in, or reboot, or simply kill X with CTRL+ALT+BACKSPACE, or ALT+PrtSc+K, and your brightness keys sh[...]

    Replacing Lenovo optical drive with second hard drive: The Lenovo adapter is disappointing


    I recently ordered a Lenovo Serial ATA Hard Drive Bay Adapter III for my Lenovo T510 laptop. This can hold a hard drive, and replaces the DVD/CD-ROM drive in your laptop, enabling your laptop to run a second hard drive.

    I've used my optical drive two, maybe three times since getting the laptop, so swapping it for another hard drive seems like a good trade for me.

    The Lenovo drive bay itself works fine, but I'm still disappointed in Lenovo's product. When installed, the drive bay looks like this:

    Lenovo Serial ATA Hard Drive Bay Adapter installed in a Lenovo T510

    The problem here is that there's a gap of approximately 3mm (~0.12 inches) between the top of the drive bay and the ceiling of the optical disk cavity. This means the drive bay can wobble vertically, so much so that I feel the need to tape it in place to stop it flopping around. This looks ridiculous.

    Secondly, in order to install your hard drive inside the Lenovo drive bay, you need a hard drive cover. This is the metal cover that encases the hard drives shipped in Lenovo laptops. The covers normally have rubber bumpers/rails to stop the drive moving around; you need to take the bumpers off to install your drive into the drive bay. The hard drive cover looks like this:

    Lenovo hard drive cover

    And with a hard drive in it, the hard drive cover looks like this:

    Lenovo hard drive cover encasing a hard drive.

    Note the screws. The drive bay has notches which the screws snap into, holding the drive securely inside the drive bay. Possibly the screws are the only important bit here; you probably don't actually need the drive cover to install the drive into the bay, just the screws, since they're what hold the drive in place inside the drive bay.

    The frustrating thing is that nothing on the Lenovo web site tells you that you need a drive cover to install a drive into the drive bay. My drive bay arrived and I had to loot my old laptop's drive cover in order to install a new drive into my current laptop.

    I also couldn't find the drive covers listed on Lenovo's web site. Presumably if you buy a laptop hard drive from Lenovo it comes with this cover, and presumably Lenovo uses this as a way to force you to buy all your laptop hard drives directly from Lenovo. That's the sort of behaviour I'd expect from Apple, not Lenovo.

    Thankfully Ann at IT was able to figure out how to order the drive covers separately. Thanks Ann!

    Overall, the product was easy to install (once I had a drive cover) and works fine (apart from the wobble), but I'm still disappointed. Next time, I'll try one of newmodeus' Lenovo drive caddies.

    Update 22 March 2015: This blog post now has a Russian translation: "Замена оптического дисковода вторым жестким диском на Lenovo" ("Replacing the optical drive with a second hard drive on a Lenovo"). [...]

    Improved key input in fullscreen mode plus pointer lock changes


    I've landed bug 716107 which removes the "Press ESC to exit fullscreen" warning upon alphanumeric key input in fullscreen mode.

    This will mean fullscreen web apps can use the full range of keys without the annoying warning message popping up every time the user presses an alphanumeric key, e.g. the WASD keys!

    In order to make this change safe, we altered the security model a bit: now when entering fullscreen we explicitly ask the user to approve or deny entering fullscreen using a modal prompt, something like this:

    Fullscreen approval prompt.
    The prompt has a "remember decision for $" checkbox, so if the user trusts the domain they can avoid having to approve fullscreen every time. If the user opts to "remember" an allow fullscreen decision, we'll still show a "$ entered fullscreen, press ESC to exit" warning when entering fullscreen, but it goes away after a few seconds.

    Once the user has approved entering fullscreen, we won't show a warning upon alphanumeric key input.

    I also landed bug 746885 which makes pointer lock wait until fullscreen has been approved using the new approval UI before granting the pointer lock request. It may take the user several seconds to approve fullscreen, so authors need to be aware that the "mozpointerlockchange" event may come in several seconds after the "mozfullscreenchange" event. Authors shouldn't assume the pointer is locked until after they've received a "mozpointerlockchange" event!

    We also changed our spelling to use "fullscreen" rather than "full-screen", since everybody (including the fullscreen draft spec) was spelling it "fullscreen" anyway.

    These changes are in Firefox Nightly builds from 9 May 2012 onwards, and will ship in Firefox 15, which is scheduled for release on 28 August 2012.

    These changes should greatly improve the experience for HTML5 games using fullscreen and pointer lock!

    When you encounter a bug, always file a bug


    If you find a bug in Firefox, or any other software for that matter, please please please file a bug! It's completely possible that the developers simply aren't aware of the bug.

    The developers may not be aware of the bug because they don't run on the same hardware, operating system, or environment as you, or they may not use the browser the same way as you do.

    Even if we already have the bug on file and we mark your bug as a duplicate, at the very least this allows us to get a coarse feel for how often the bug is encountered in the wild.

    If the developers aren't aware of a bug, there's no way they can fix it! Please file bugs!

    Changes to DOM full-screen API in Firefox 11


    We've made some changes to how the HTML full-screen API exits full-screen mode in Firefox 11, which is scheduled to ship in March 2012. Previously Document.mozCancelFullScreen() would fully-exit full-screen and return the browser to "normal" mode. Starting in Firefox 11, Document.mozCancelFullScreen() will restore full-screen state to the element that was previously full-screen. If there is no previous full-screen element in either the document or a parent document (full-screen mode isn't restored to former full-screen elements in child documents), then the browser will "fully-exit full-screen", and return the browser to normal mode.
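    As a toy model (all names here are mine, not Gecko's) of the restore behaviour described above, think of the browser as keeping a stack of full-screen elements; cancelling pops the stack and restores the previous element, rather than fully exiting:

```javascript
// Toy model (not Gecko code) of the Firefox 11 exit behaviour: the
// browser tracks nested full-screen requests as a stack, and
// cancelFullScreen restores the previous full-screen element. Only
// when the stack is empty does the browser fully exit to normal mode.
class FullScreenStack {
  constructor() { this.stack = []; }
  requestFullScreen(element) {
    this.stack.push(element);
    return element; // the new full-screen element
  }
  cancelFullScreen() {
    this.stack.pop();
    // Restore the previous full-screen element, or fully exit
    // (return to "normal" mode) if there was none.
    return this.stack.length ? this.stack[this.stack.length - 1] : null;
  }
}

const fs = new FullScreenStack();
fs.requestFullScreen("slides"); // the slide deck goes full-screen
fs.requestFullScreen("video");  // a video within the deck goes full-screen
console.log(fs.cancelFullScreen()); // "slides": restored, not fully exited
```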

    To see how this is useful, consider the case of a PowerPoint clone or presentation web app that wants to run full-screen. One way to implement such a web app would be to have a full-screen element where the slides are shown. The developer may want to be able to switch full-screen mode seamlessly between the slide deck and (say) a[...]