Mozilla DevCenter

What Is Firefox
Brian King provides a brief look at Firefox's origins and evolution, and then dives into its support for web standards like CSS and XML, its debugging and extension capabilities, and some cool new features in the upcoming 1.5 release. If you're considering a switch to Firefox, this article may help make the decision for you.


Mozilla as a Development Platform: An Interview with Axel Hecht  Axel Hecht is a member of Mozilla Europe's board of directors, and a major contributor to the Mozilla project. At O'Reilly's European Open Source Convention (October 17-20), Dr. Hecht will be talking about Mozilla as a development platform. O'Reilly Network interviewed Dr. Hecht to find out if the long-held dream of Mozilla as a development platform was about to come true.   [O'Reilly Network]

A Firefox Glossary  Brian King, with some help from Nigel McFarlane, covers everything from about:config to "zool" in this fun, fact-filled Firefox glossary. It's by no means exhaustive, but throughout the glossary you'll find references to specific chapters or hacks in Nigel's book, Firefox Hacks. When you're ready to dig deeper, check out his book.   [O'Reilly Network]

Important Notice for Mozilla DevCenter Readers About O'Reilly RSS and Atom Feeds  O'Reilly Media, Inc. is rolling out a new syndication mechanism that provides greater control over the content we publish online. Here's information to help you update your existing RSS and Atom feeds to O'Reilly content.  [Mozilla DevCenter]

Hacking Firefox  This excerpt from Firefox Hacks shows you how to use overlays (essentially hunks of UI data) to make something you want to appear in the Firefox default application, perhaps to carry out a particular function of your extension. For example, you might want to add a menu item to the Tools menu to launch your extension. Overlays allow existing Firefox GUIs to be enhanced.   [O'Reilly Network]

Mozile: What You See is What You Edit  Most modern browsers don't allow you to hit "edit" and manipulate content as easily as you view it, WYSIWYG-style. Mozile, which stands for Mozilla Inline Editor, is a new Mozilla plug-in for in-browser editing. This article by Conor Dowling provides an overview of Mozile and what in-browser editing means.   [Mozilla DevCenter]

The Future of Mozilla Application Development  Recently, mozilla.org announced a major update to its development roadmap. Some of the changes in the new document represent a fundamental shift in the direction and goals of the Mozilla community. In this article, David Boswell and Brian King analyze the new roadmap, and demonstrate how to convert an existing XPFE-based application into an application that uses the new XUL toolkit. David and Brian are the authors of O'Reilly's Creating Applications with Mozilla.   [Mozilla DevCenter]

Remote Application Development with Mozilla, Part 2  In their first article, Brian King, coauthor of Creating Applications with Mozilla, and Myk Melez looked at the benefits of remote application development using Mozilla technologies such as XUL and web services support. In this article, they present a case study of one such application, the Mozilla Amazon Browser, a tool for searching Amazon's catalogs.   [Mozilla DevCenter]

Remote Application Development with Mozilla  This article explores the uses for remote XUL (loaded from a Web server), contrasts its capabilities with those of local XUL (installed on a user's computer), explains how to deploy remote XUL, and gives examples of existing applications.   [Mozilla DevCenter]

Mozdev.org Made Easy  Now that mozilla.org is about to release Mozilla 1.2 and Netscape has come out with the latest version of their own Mozilla-based browser, Netscape 7, this is a great time to see what other people are building with Mozilla's cross-platform development framework. Here's a little history about, and a roadmap to, mozdev.org.   [Mozilla DevCenter]

XML Transformations with CSS and DOM  Mozilla permits XML to be rendered in the browser with CSS and manipulated with DOM. If you're already familiar with CSS and DOM, you're more than halfway to achieving XML transformations in Mozilla. This article demonstrates how to render XML in the browser with a minimum of CSS and JavaScript.   [Mozilla DevCenter]

Roll Your Own Browser  Here's a look at using the Mozilla toolkit to customize, or even create your own browser.   [Mozilla DevCenter]

Let One Hundred Browsers Bloom  In this article, David Boswell, coauthor of Creating Applications with Mozilla, surveys some of the more interesting, and useful, Mozilla-based browsers available now.   [Mozilla DevCenter]

Using the Mozilla SOAP API  With the release of Mozilla 1.0, the world now has a browser that supports SOAP natively. This article shows you how Web applications running in Mozilla can now make SOAP calls directly from the client without requiring a browser refresh or additional calls to the server.   [Web Development DevCenter]





Today's News
August 22, 2014

Peter Bengtsson: premailer now with 100% test coverage

One of my most popular GitHub Open Source projects is premailer. It's a Python library that combines HTML and CSS into HTML with all of the CSS inlined into style attributes on the tags. This is a useful and necessary technique when sending HTML emails, because you can't send those with an external CSS file (or, in many cases, even a CSS style tag).
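
As a rough illustration (not from the post; the HTML below is made up, and transform() is the shortcut function premailer exposes), using the library looks something like this:

from premailer import transform

html = """<html>
  <head>
    <style>p { color: red; }</style>
  </head>
  <body><p>Hello</p></body>
</html>"""

# transform() returns the same document with the CSS moved into
# style="" attributes, e.g. <p style="color: red">Hello</p>
print(transform(html))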

The project has had 23 contributors so far and, as always, people come in, scratch whatever itch they have, and then leave. I really try to keep good test coverage, and when people contribute code I almost always require that it come with tests too.

But sometimes you miss things. Also, this project was born as a weekend hack that slowly morphed into an actual package with its own repository, and I bet there was code from those days that was never fully covered by tests.

So today I combed through the code and plugged all the holes where there wasn't test coverage.
Also, I set up Coveralls (project page), an awesome service that hooks into Travis CI so that on every build and every pull request, the tests are run with --with-cover on nosetests and the output is reported to Coveralls.

The relevant changes you need to make are:

1) You need to go to coveralls.io (sign in with your GitHub account) and add the repo.
2) Edit your .travis.yml file to contain the following:

before_install:
    - pip install coverage
...
after_success:
    - pip install coveralls
    - coveralls

And you need to execute your tests so that coverage is calculated (the coverage module stores everything in a .coverage file, which coveralls analyzes and sends). So in my case I changed it to this:

script:
    - nosetests premailer --with-cover --cover-erase --cover-package=premailer

3) You must also give it some clues so that it reports on only the relevant files, via a .coveragerc file. Here's what mine looked like:

[run]
source = premailer

[report]
omit = premailer/test*
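
For reference, putting steps 2 and 3 together, the whole .travis.yml ends up looking roughly like this (a sketch assembled from the fragments above; the real file in the premailer repo may differ, e.g. in which Python versions it tests):

language: python
python:
    - "2.7"          # assumed; the actual version matrix isn't shown here
before_install:
    - pip install coverage
install:
    - pip install -r requirements.txt   # assumed install step
script:
    - nosetests premailer --with-cover --cover-erase --cover-package=premailer
after_success:
    - pip install coveralls
    - coveralls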

Now, I get to have a cute "coverage: 100%" badge in the README, and when people post pull requests, Coveralls will post a comment reflecting how the pull request changes the test coverage.

I am so grateful for all these wonderful tools. And it's all free too!

[Source: Planet Mozilla]

Mozilla WebDev Community: Beer and Tell – August 2014

Once a month, web developers from across the Mozilla Project get together to upvote stories on Hacker News from each of our blogs. While we’re together, we usually end up sharing a bit about our side projects over beers, which is why we call this meetup “Beer and Tell”.

There’s a wiki page available with a list of the presenters, as well as links to their presentation materials. There’s also a recording available courtesy of Air Mozilla.

Frederik Braun: Room Availability in the Berlin Office

freddyb shared (via a ghost presentation by yours truly) a small webapp he made that shows the current availability of meeting rooms in the Mozilla Berlin office. The app reads room availability from Zimbra, which Mozilla uses for calendaring and booking meeting rooms. It also uses moment.js for rendering relative dates to let you know when a room will be free.
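
As an aside, that sort of relative-date rendering is a one-liner with moment.js; something like this (illustrative only, not freddyb's actual code):

// Assuming `freeAt` is the Date at which the room next becomes free:
var freeAt = new Date(Date.now() + 25 * 60 * 1000);
console.log(moment(freeAt).fromNow());  // e.g. "in 25 minutes"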

The discussion following the presentation brought up a few similar apps that other Mozilla offices had made to show off their availability, such as the Vancouver office’s yvr-conf-free and the Toronto office’s yyz-conf-free.

Nigel Babu: hgstats

nigelb shared (via another ghost presentation, this time split between myself and laura) hgstats, which shows publicly-available graphs of the general health of Mozilla’s mercurial servers. This includes CPU usage, load, swap, and more. The main magic of the app is to load images from graphite, which are publicly visible, while graphite itself isn’t.

nigelb has offered a bounty of beer for anyone who reviews the app code for him.

Pomax: Inkcyclopedia

Pomax shared an early preview of Inkcyclopedia, an online encyclopedia of ink colors. Essentially, Pomax bought roughly 170 different kinds of ink, wrote down samples with all of them, photographed them, and then collected those images along with the kind of ink used for each. Once finished, the site will be able to accept user-submitted samples and analyze them to attempt to identify the color and associate it with the ink used. Unsurprisingly, the site is able to do this using the RGBAnalyse library that Pomax shared during the last Beer and Tell, in tandem with RgbQuant.js.

Sathya Gunasekaran: screen-share

gsathya shared a screencast showing off a project that has one browser window running a WebGL game and sharing its screen with another browser window via WebRTC. The demo currently uses Chrome’s desktopCapture API for recording the screen before sending it to the listener over WebRTC.


Alas, we were unable to beat Hacker News’s voting ring detection. But at least we had fun!

If you’re interested in attending the next Beer and Tell, sign up for the dev-webdev@lists.mozilla.org mailing list. An email is sent out a week beforehand with connection details. You could even add yourself to the wiki and show off your side-project!

See you next month!

[Source: Planet Mozilla]

Advancing Content: A Call for Trust, Transparency and User Control in Advertising

Advertising is the Web's dominant business. It relies on users for its success, yet ironically it fails to engage with them in a direct and honest way. We are advocates of the many benefits that commercial involvement brings to the development of the Internet – it is at our core and part of the Mozilla Manifesto. Advertising is one of those commercial activities; it fuels and grows the Web. But the model has lost its focus by failing to put the user at the center. We are calling initially on the advertising industry to adopt three core principles of trust, transparency and user control:

1)  Trust: Do users understand why they are being presented with content? Do they understand what pieces of their data fed into the display decision?

2)  Transparency: Is it clear to users why advertising decisions are made? Is it clear how their data is being consumed and shared?  Are they aware and openly contributing?

3)  Control: Do users have the ability to control their own data? Do they have the option to be completely private, completely public or somewhere in between?

We are re-thinking the model.  We want a world where Chief Marketing Officers, advertising agency executives, industry groups and the advertising technology companies see the real benefits of a user-centric model. These three principles give us the ability to build a strong, long term and more valuable platform for everyone.

What are we doing?

Our intention is to improve the experience from within, as a player in the ecosystem. We'll do this by experimenting and innovating. All of our work will be designed with trust in mind. Tiles is our first experiment, and we are learning a lot. Right now, we are showing users tiles based on their "frecency" (recent and frequent sites), along with Mozilla information, suggestions, and content labeled as sponsored. This experience is pretty basic but will evolve over time. Initial user interactions are positive: users interacted with the sponsored content we placed in directory tiles 10x more than with Mozilla-based content.

Our next step will be to give users more transparency and control. Our UP platform will eventually help to power tiles and will help determine which content is displayed to the user. The platform itself is innovative in that it currently allows the interests data to sit client-side, completely in the user's control. The data can still be accessed there without us creating a dossier on the user outside of the Firefox client.

We will then put the user first by building an interests dashboard (something that we are already working on) that offers users a way to easily change their interests or participation in enhanced content at any time. The dashboard provides a constant feedback loop with users and will work with all our enhanced content projects.

What can we promise?

We will continue to demonstrate that it’s possible to balance commercial interests with public benefit, and to build successful products that respect user privacy and deliver experiences based upon trust, transparency and control.

  • We want to show the world you can do display advertising in a way that respects users’ privacy.
  • We believe that publishers should respect browser signals around tracking and privacy. If they don’t, we’ll take an active role in doing so and all our enhanced content projects will respect DNT.
  • We will respect the Minimal Actionable Dataset, a thought stream pioneered by one of our fellow Mozillians to only collect what’s needed – nothing more – and be transparent about it.
  • We will put users in control to customize, change or turn product features on/off at any time.

We can’t change the Web from the sidelines, and we can’t change advertising on the Web without being a part of that ecosystem. We are excited about this mission and we’re working hard to achieve our goals. Stay tuned for updates over the coming weeks.

If this resonates with you and you have ideas or want to help, we'd love to hear from you by leaving comments below or by filling out this form.

[Source: Planet Mozilla]

Mozilla Open Policy & Advocacy Blog: Trust should be the currency

At Mozilla, we champion a Web  that empowers people to reach their full potential and be in control of their online lives. In my role at Mozilla this means advocating for products, policies and practices that respect our users and create trusted online environments and experiences.  We believe trust is the most important currency on the Web – and when that trust is violated, the system fails.

I have been spending a lot of time with our Content Services team as they work on their new initiatives.  Their first challenge is tackling the online advertising ecosystem.  This is hard work but extremely important.  Our core values of trust, transparency and control are just as applicable to the advertising industry as to any other, but they aren’t widely adopted there.

Today, online advertising is rife with mistrust.  It is opaque for most users because the value exchange is not transparent.  While it should be trust, the prevailing Web currency is user data – much of the content is free because publishers and websites generate revenue through advertising.  At its core, this model is not new or unique, it is common in the media industry (e.g., broadcast television commercials and newspapers that are ad supported).  To improve monetization, online ads are now targeted based on a user’s browsing habits and intentions.  This isn’t a bad thing when done openly or done with consent.  The problem is that this “personalization” is not always transparent, leaving users in the dark about what they have traded for their content.  This breaks the system.

Our users and our community have told us – through surveys, comments and emails – that transparency and control matter most to them when it comes to online advertising.  They want to know what is happening with their data; they want to control what data is shared, understand how their data is used and what they get for that exchange.  They are willing to engage in the value exchange and allow their data to be used if they understand what happens next.  Our users want trust (and not their data) to be the prevailing currency.  We believe that without this shift in focus, users will limit access to their data and will block ads.

We want our users to not only trust us but to be able to trust the Web. We want to empower their choices and help them control their online experience. This is why we pioneered the Do Not Track (DNT) initiative.  DNT relies on advertisers, publishers and websites to respect a user’s preference. Unfortunately, many participants in the online advertising ecosystem do not modify their behavior in response to the DNT signal.  In this instance, user choice is not being respected.  So, we must do more for the user and continue to innovate.

We are doing this by working within the ecosystem to create change.  We are testing our new tiles feature in Firefox and working to ensure that it provides personalization with respect and transparency built in. We are building DNT and other user controls into the tiles experiments and working to establish these foundational elements with our partners.  We are providing users with more information about their Web presence through Lightbeam, and will be testing new privacy initiatives that give users more control over the flow of their data.  We want to bring relevant and personalized content to our users while empowering control that inspires trust.

We need to see a renewed focus on trust, transparency and control on the Web as a whole.  We can all do better.  We want to see more products and services (and not just in online advertising) developed with those ideals in mind.  For our part, we will continue to do more to innovate and create change so that we deserve your trust.

 

[Source: Planet Mozilla]

Aaron Klotz: Profile Unlocking in Firefox 34 for Windows

Today’s Nightly 34 build includes the work I did for bug 286355: a profile unlocker for our Windows users. This should be very helpful to those users whose workflow is interrupted by a Firefox instance that cannot start because a previous Firefox instance has not finished shutting down.

Firefox 34 users running Windows Vista or newer will now be presented with this dialog box:

Clicking “Close Firefox” will terminate that previous instance and proceed with starting your new Firefox instance.

Unfortunately this feature is not available to Windows XP users. To support this feature on Windows XP we would need to call undocumented API functions. I prefer to avoid calling undocumented APIs when writing production software due to the potential stability and compatibility issues that can arise from doing so.

While this feature adds some convenience to an otherwise annoying issue, please be assured that the Desktop Performance Team will continue to investigate and fix the root causes of long shutdowns so that a profile unlocker hopefully becomes unnecessary.

[Source: Planet Mozilla]

Pete Moore: Weekly review 2014-08-21

Highlights since last review

  • Wrote Android Play Store code, got r+ from Rail
  • Set up staging environment, staging release hopefully today
  • Solved pip install problems

Goals for next week:

  • Get back to vcs sync work

Bugs I created since last review:

Other bugs I updated since last review:

[Source: Planet Mozilla]

Doug Belshaw: Some preliminary thoughts toward v2.0 of Mozilla's Web Literacy Map

As we approach the Mozilla Festival 2014, my thoughts are turning towards revisiting the Web Literacy Map. This, for those who haven’t seen it, comprises the skills and competencies Mozilla and a community of stakeholders believe to be important to read, write and participate on the web. Now that we’ve had time to build and iterate on top of the first version, it’s time to start thinking about a v2.0.

Thinking

The first thing to do when revisiting something like this is to celebrate the success it's had: webmaker.org/resources is now structured using the 15 competencies identified in v1.1 of the Web Literacy Map. Each of those competencies now has an associated badge. We've published a whitepaper entitled Why Mozilla cares about Web Literacy, in which it features heavily. It's also been used as the basis of the Boys and Girls Clubs of America's new technology strategy, and by MOUSE in their work around Privacy. That's just a few examples amongst the countless other times it's been shared on social media and by people looking for something more nuanced than the usual new literacies frameworks.

Deadlines being what they are, the group that were working on the Web Literacy Map had to move a bit more quickly than we would have liked in the final stages of putting it together. As a result, although the 15 competencies are reasonably solid, we were never 100% happy with the description of the skills underpinning each of these. Nevertheless, we decided to roll with it for launch, made a few updates post-MozFest, and then ‘froze’ development so that others could build on top of it.

At the beginning of 2014, the Open Badges work at Mozilla was moved to a new non-profit called the Badge Alliance. As co-chair of the working group on Digital & Web Literacies, I’ve had a chance to think through web literacy from the perspective of a badged learning pathway with some of the people who helped put together the Web Literacy Map.

The feeling I get is that with version 2.0 we need to address both the issues we put to one side for the sake of expediency and the issues that have cropped up since then. I can name at least five (not listed in any order):

  • Identity
  • Storytelling
  • Protecting the web (e.g. Net Neutrality)
  • Mobile
  • Computer Science

We’re generally happy with the 15 competencies identified in v1.1 of the Web Literacy Map, and we’ve built resources and badges on top of them. Version 2.0, therefore, is likely to be more about evolution, not revolution.

If you’ve got any thoughts on this, please do add them to this thread. Alternatively, I’m @dajbelshaw on Twitter and you can email me at doug@mozillafoundation.org

[Source: Planet Mozilla]

Adam Lofting: Overlapping types of contribution

TL;DR: Check out this graph!

Ever wondered how many Mozfest volunteers also host events for Webmaker? Or how many code contributors have a Webmaker contributor badge? Now you can find out.

The reason the MoFo Contributor dashboard we're working from at the moment is called our interim dashboard is that it combines numbers from multiple data sources, but the number of contributors is not de-duped across systems.

So if you’re counted as a contributor because you host an event for Webmaker, you will be double counted if you also file bugs in Bugzilla. And until now, we haven’t known what those overlaps look like.

This interim solution wasn’t perfect, but it’s given us something to work with while we’re building out Baloo and the cross-org areweamillionyet.org (and by ‘we’, the vast credit for Baloo is due to our hard working MoCo friends Pierros and Sheeri).

To help with prepping MoFo data for inclusion in Baloo (and by generally being awesome), JP wired up an integration database for our MoFo projects, skipping a night of sleep to ship V1!

We’ve tweaked and tuned this in the last few weeks and we’re now extracting all sorts of useful insights we didn’t have before. For example, this integration database is behind quite a few of the stats in OpenMatt’s recent Webmaker update.

The downside to this is that we will soon have a de-duped number for our dashboard, which will be smaller than the current number. That will feel like a bit of a downer, because we've been enthusiastically watching that number go up as we've built out contribution tracking systems throughout the year.

But, a smaller more accurate number is a good thing in the long run, and we will also gain new understanding about the multiple ways people contribute over time.

We will be able to see how people move around the project, and find that what looks like someone 'stopping' contributing might be them switching focus to another team, for example. There are lots of exciting possibilities here.

And while I’m looking at this from a metrics point of view today, the same data allows us to make sure we say hello and thanks to any new contributors who joined this week, or to reach out and talk to long running active contributors who have recently stopped, and so on.

[Source: Planet Mozilla]

Marco Zehe: Blog maintenance on Saturday

On Saturday, August 23, starting at 9 AM GMT+02:00 (3 AM Eastern, midnight Pacific), this blog will undergo some much needed maintenance. Afterwards it will hopefully be faster, and also have a new theme. I’ll try to keep the interruption as brief as possible. But just in case, so you know. :)

[Source: Planet Mozilla]

Jen Fong-Adwent: revisit.link A little over 3 years ago, I was learning node and wanted to try a project with it. [Source: Planet Mozilla]

Peter Bengtsson: Aggressively prefetching everything you might click

I just rolled out a change here on my personal blog which I hope will make my few visitors happy.

Basically: when you hover over a (local) link long enough, it prefetches it (with AJAX) so that if you do click, it's hopefully already cached in your browser.

If you hover over a link and almost instantly hover out it cancels the prefetching. The assumption here is that if you deliberately put your mouse cursor over a link and proceed to click on it you want to go there. Because your hand is relatively slow I'm using the opportunity to prefetch it even before you have clicked. Some hands are quicker than others so it's not going to help for the really quick clickers.

What I also had to do was set a Cache-Control header of 1 hour on every page so that the browser can learn to cache it.
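
The post doesn't say which framework serves the blog, so purely as a hypothetical illustration, here's how that header could be set in a small Flask app:

from flask import Flask, make_response

app = Flask(__name__)

@app.route('/<path:slug>')
def blog_post(slug):
    # The HTML string here is a stand-in for however the real page is built.
    response = make_response('<html>...page for %s...</html>' % slug)
    # Cache for 1 hour so a prefetched copy can be reused from the browser cache.
    response.headers['Cache-Control'] = 'public, max-age=3600'
    return response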

The effect is that when you do finally click the link, by the time your browser loads it and changes the rendered output, it'll hopefully be able to render it from its cache and thus become visually ready faster.

Let's try to demonstrate this with this horrible animated gif:
(or download the screencast.mov file)

Screencast
1. Hover over a link (in this case the "Now I have a Gmail account" from 2004)
2. Notice how the Network panel preloads it
3. Click it after a slight human delay
4. Notice that when the clicked page is loaded, it's served from the browser cache
5. Profit!

So the code that does this is quite simple:

$(function() {
  var prefetched = [];        // URLs we have already prefetched
  var prefetch_timer = null;  // pending timer for the link currently hovered
  $('div.navbar, div.content').on('mouseover', 'a', function(e) {
    var value = e.target.attributes.href.value;
    if (value.indexOf('/') === 0) {           // only local links
      if (prefetched.indexOf(value) === -1) { // only prefetch each URL once
        if (prefetch_timer) {
          clearTimeout(prefetch_timer);
        }
        // wait 200ms so a quick hover-and-leave doesn't trigger a request
        prefetch_timer = setTimeout(function() {
          $.get(value, function() {
            // necessary for $.ajax to start the request :(
          });
          prefetched.push(value);
        }, 200);
      }
    }
  }).on('mouseout', 'a', function(e) {
    // hovering out cancels any prefetch that hasn't started yet
    if (prefetch_timer) {
      clearTimeout(prefetch_timer);
    }
  });
});

Also, available on GitHub.

I'm excited about this change for a couple of reasons:

  1. On mobile, where you might be on a non-wifi data connection, you don't want this. There, the onmouseover event never fires, so people on such devices don't "suffer" from this optimization.
  2. It only downloads the HTML, which is quite light compared to static assets such as pictures, but it warms up the server-side cache if need be.
  3. It's much more targeted than a general prefetch meta header.
  4. Most likely the content will appear rendered to your eyes faster.
[Source: Planet Mozilla]

Mike Shal: PGO Performance on SeaMicro Build Machines Let's take a look at why our SeaMicro (sm) build machines perform slower than our iX machines. In particular, the extra time it takes to do non-unified PGO Windows builds can cause timeouts in certain cases (on Aurora we have bug 1047621). Since this was a learning experience for me and I hit a few roadblocks along the way, I thought it might be useful to share the experience of debugging the issue. Read on for more details! [Source: Planet Mozilla]

David Boswell: Quality over Quantity

I was in Portland last week for a work week and Michelle recommended that I try the donuts at Blue Star. The blueberry donut was really great. The inside of the bakery was interesting too—right inside the doors was a big mural that said ‘Quality over Quantity’.


That turned out to be a good summary of the work week. We were checking in on progress toward this year's goal to grow the number of active contributors by 10x, and also thinking about how we could increase the impact of our community building work next year.

One clear take-away was that community building can’t be all about growth. Some teams, like Location Service, do need large numbers of new active contributors, but many teams don’t. For instance, localization needs to develop the active contributors already in the project into core contributors that can take on a bigger role.

For me, creating a draft framework that would give us more ways to support teams and communities was the most important thing we did—in addition to taking a great team photo :)


Growth is part of this framework, but it includes other factors for us to look at to make sure that we’re building healthy functional and regional communities. The health measures we think we should be focusing on next year are:

  • Retention (how many contributors are staying and leaving)
  • Growth (how many new contributors are joining)
  • Development (how many contributors are getting more deeply involved in a project)
  • Sentiment (how do contributors feel about being involved)
  • Capacity (how are teams increasing their ability to build communities)

Having this more nuanced approach to community building will create more value because it aligns better with the needs we’re seeing across Mozilla. The growth work we’ve done has been critical to getting us here and we should continue that along with adding more to what we offer.


Rainer just posted a video of a story Chris Hofmann told at last year's summit about one contributor who had a huge impact on the project. This is a great example of how we should be thinking more broadly about community building.

We should be setting up participation systems that let us help teams build long-lasting relationships with contributors like Scoobidiver as well as helping teams connect with large numbers of people to focus on an issue for a short time when that is what’s needed.

Moral of this story: Eat more donuts—they help you think :)


[Source: Planet Mozilla]

Vladimir Vukićević: Updated Firefox VR Builds

I’d like to announce the third Firefox Nightly build with experimental VR support. Download links:

This build includes a number of fixes to CSS VR rendering, as well as some API additions and changes:

  • Fixed CSS rendering (see below for more information)
  • Support for DK2 via 0.4.1 SDK (extended mode only)
  • Experimental auto-positioning on MacOS X — when going fullscreen, the window should move itself to the Rift automatically
  • hmd.setFieldOfView() now takes zNear and zFar arguments
  • New API call: hmd.getRecommendedEyeRenderRect() returns the suggested render dimensions for a given eye; useful for WebGL rendering (see below)

The DK2 Rift must be in Extended Desktop mode. You will also need to rotate the Rift’s display to landscape. If tracking doesn’t seem to be working, stop the Oculus service using the Configuration Tool first, then launch Firefox.

CSS Rendering

Many issues with CSS rendering were fixed in this release. As part of this, the coordinate space when in fullscreen VR is different than normal CSS. When in fullscreen VR mode, the 0,0,0 coordinate location refers to the center of the viewport (and not the top left as is regular in CSS). Additionally, the zNear/zFar values specified to setFieldOfView control the near and far clipping planes.

The coordinate units are also not rationalized with CSS coordinates. The browser applies a per-eye transform in meters (~0.032 meters left/right, or 3.2cm) before rendering the scene; thus the coordinate space ends up being ~1px = ~1m in real space, which is not correct. This will be fixed in the next release.

Here’s a simple example of showing 4 CSS images on all sides around the viewer, along with some text. The source includes copious comments about what’s being done and why.

Known issues:

  • The Y axis is flipped in the resulting rendering. (Workaround: add a rotateZ() to the camera transform div)
  • The initial view doesn’t face the same direction as CSS (Workaround: add a rotateY() to the camera transform div)
  • Manual application of the HMD orientation/position is required.
  • Very large CSS elements (>1000px in width/height) may not be rendered properly
  • Units are not consistent when in VR mode

getRecommendedEyeRenderRect()

NOTE: This API will likely change (and become simpler) in the next release.

getRecommendedEyeRenderRect will return the rectangle into which each eye should be rendered, and the best resolution for the given field of view settings. To create an appropriately sized canvas, the size computation should be:

var leftRect = hmd.getRecommendedEyeRenderRect("left");
var rightRect = hmd.getRecommendedEyeRenderRect("right");
var width = leftRect.x + Math.max(leftRect.width, rightRect.x) + rightRect.width;
var height = Math.max(leftRect.y, rightRect.y) + Math.max(leftRect.height, rightRect.height);

In practice, leftRect.x will be 0, and the y coordinates will both be 0, so this can be simplified to:

var width = leftRect.width + rightRect.width;
var height = Math.max(leftRect.height, rightRect.height);

Each eye should be rendered into the leftRect and rightRect coordinates. This API will change in the next release to make it simpler to obtain the appropriate render sizes and viewports.
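
As a rough sketch of that last step (not from the post; renderScene() is a made-up placeholder for your own drawing code), per-eye rendering with those rects could look like this:

var canvas = document.createElement("canvas");
canvas.width = width;    // from the computation above
canvas.height = height;
var gl = canvas.getContext("webgl");

function drawFrame() {
  // Draw each eye into its recommended rectangle.
  gl.viewport(leftRect.x, leftRect.y, leftRect.width, leftRect.height);
  renderScene("left");    // placeholder for your left-eye camera pass
  gl.viewport(rightRect.x, rightRect.y, rightRect.width, rightRect.height);
  renderScene("right");   // placeholder for your right-eye camera pass
  requestAnimationFrame(drawFrame);
}
requestAnimationFrame(drawFrame);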

Comments and Issues

As before, issues are welcome via GitHub issues on my gecko-dev repo. Additionally, discussion is welcome on the web-vr-discuss mailing list.

[Source: Planet Mozilla]

Christian Heilmann: No more excuses – subtitle your YouTube videos

I was just very pleasantly surprised that the subtitling interface in YouTube has gone leaps and bounds since I last looked at it.

One of the French contributors to Mozilla asked me to get subtitles for the Flame introduction videos, and I felt the sense of dread you get when requests like those come in. It seems like a lot of work for not much gain.

However, using YouTube's auto-captioning tool, this is quite a breeze:


I just went to the Subtitles and CC tab and told YouTube that the video is English. Almost immediately (this is kind of fishy – does YouTube already create text from speech for indexing reasons?) I got a nice set of subtitles, time-stamped and all.

Hitting the edit button, I was able to fix the few mistakes the recognition made; it was a simple process of listening as you type. I then turned on the subtitles and exported the SRT files for translation.

I was very impressed with the auto-captioning as I am not happy with the quality of my talking in those videos (they were rushed and the heartless critic in me totally hears that).

Of course, there is also Amara as a full-fledged transcribing, captioning and translation tool, but there are not many excuses left for us not to subtitle our short videos.

Let’s not forget that subtitles are amazing and not only a tool for the hard of hearing:

  • I don’t have to put my headphones in when watching your video in public – I can turn off the sound and not annoy people in the cafe
  • As a non-native speaker they are great to learn a new language (I learned English watching Monty Python’s Flying Circus with subtitles – the only program that did that back then in Germany. This might explain a few things)
  • You can search a video by content without having to know the time stamp and you can provide the subtitles as a transcript in a post
  • You help people with various disabilities to make your work understandable.

Go, hit that Subtitles tab!

[Source: Planet Mozilla]
