Stitches Google Street View and Photo Sphere tiles into an equirectangular image. For use in the browser, with Webpack or Browserify.
Also includes an intermediate mode for higher quality WebGL rendering on low-end devices.
For full examples, see the demo/ folder, or see Running From Source below for details.
Make sure to include the Google Client library first:
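The panorama lookup depends on the Google Maps JavaScript API, so a typical include looks like the following (YOUR_API_KEY is a placeholder for your own key):

```html
<!-- Load the Google Maps JavaScript API before using the stitcher. -->
<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY"></script>
```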
Creates a new StreetView stitcher with optional settings, which can include:
- an integer between 0 and 5 (inclusive), defaults to 1
- an HTMLCanvasElement to re-use, defaults to creating a new element
- the tile dimensions; defaults to assuming StreetView image dimensions
- the crossOrigin string used for image loading
- some new StreetView IDs can be fetched by size; if the ID falls into this category, the returned image will ignore the parameter and instead try to find something by dimensions
- the protocol to use when requesting images; defaults to undefined, which loads a protocol-relative URL, but you can specify an exact protocol such as http: or https: if desired (which may be necessary in some environments like CocoonJS)
Here is an example using google-panorama-by-id.
It's recommended you specify the tile dimensions for an accurate result across different image types (panorama, photo sphere, etc).
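As a rough illustration of the flow (not the module's exact API — the option names `zoom` and `tiles` are assumptions based on the settings above, and stub functions stand in for google-panorama-by-id and the stitcher so the snippet runs anywhere):

```javascript
// Stub standing in for google-panorama-by-id (the real module queries
// the Google Maps API and returns panorama metadata asynchronously).
function getPanoramaById (id, cb) {
  cb(null, { id: id, tiles: { tileWidth: 512, tileHeight: 512 } })
}

// Stub standing in for the stitcher; the option names here are
// assumptions, not the module's documented signature.
function stitch (result, opts) {
  return { id: result.id, zoom: opts.zoom, tiles: opts.tiles }
}

getPanoramaById('some-pano-id', function (err, result) {
  if (err) throw err
  // pass the tile dimensions along for an accurate result
  var stitcher = stitch(result, { zoom: 2, tiles: result.tiles })
  console.log(stitcher.zoom) // 2
})
```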
Called when the stitching process begins, with a parameter that includes the output of google-panorama-tiles:
Called after a new tile has been loaded and drawn to the canvas.
In intermediate mode, the image might be an Image or a Canvas, depending on whether a crop was necessary.
Called when an image is skipped because it could not be found; details of the skipped tile are passed.
Called when the stitching is complete. The resulting canvas is passed as the parameter.
In intermediate mode, the canvas passed is the one used during cropping.
The default export stitches all tiles into a single Canvas element. This is convenient, but not ideal for low-end devices like iOS Safari. In some browsers, there is a maximum size for canvas elements, and no way to query this value.
For example, on an iPhone with 256 MB of RAM, the full canvas size must stay under roughly 3 megapixels.
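One practical consequence: before stitching, you can pick the largest zoom level whose output fits a device's pixel budget. A hypothetical helper (not part of this module; the base dimensions here are illustrative powers of two, not the exact StreetView tile-grid sizes):

```javascript
// Pick the largest zoom whose stitched canvas stays within a pixel
// budget. Each zoom level doubles both dimensions of the output.
function maxZoomForBudget (baseWidth, baseHeight, maxZoom, budgetPixels) {
  for (var zoom = maxZoom; zoom >= 0; zoom--) {
    var w = baseWidth * Math.pow(2, zoom)
    var h = baseHeight * Math.pow(2, zoom)
    if (w * h <= budgetPixels) return zoom
  }
  return 0
}

// e.g. a 3 MP budget on a low-memory device
console.log(maxZoomForBudget(512, 256, 5, 3e6)) // 2
```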
WebGL applications can leverage "intermediate rendering" mode which keeps no more than a single 512x512 canvas in memory at a time. This allows higher quality panoramas to be stitched on low-end devices. The interface is the same, and can be required like this:
Each event simply returns a cropped image for that tile. You will need to stitch and upload the sub-images to WebGL yourself. See demo/gpu.js for an example.
In intermediate mode, the image in each event might be a canvas or an image, depending on whether a crop was necessary. A separate field provides access to the underlying HTMLImageElement, so the event data is:
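Stitching the sub-images yourself mostly comes down to knowing where each tile lands in the full texture. A sketch, assuming row-major 512x512 tiles (the column/row names are assumptions about the event data):

```javascript
// Compute the pixel offset of a tile within the full equirectangular
// texture, e.g. for a gl.texSubImage2D-style upload of one tile.
function tileOffset (column, row, tileSize) {
  return { x: column * tileSize, y: row * tileSize }
}

var off = tileOffset(3, 1, 512)
console.log(off.x + ',' + off.y) // 1536,512
```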
Running From Source
To run the demos from source:
Now run one of the demos:
And open http://localhost:9966/. Changing the source will reload the browser page.
Thanks to @thespite's prior work on PanomNom.js, which was used as a reference while building these modules.
MIT, see LICENSE.md for details.