Merge pull request #3527 from elemoine/makefile
Replace pake with make?
@@ -9,8 +9,8 @@ before_script:
  - "rm src/ol/renderer/webgl/*shader.js"
  - "sh -e /etc/init.d/xvfb start"

-script: "./build.py ci"
+script: "make ci"

after_success:
-  - "npm run test-coverage"
+  - "make test-coverage"
  - "cat coverage/lcov.info | ./node_modules/.bin/coveralls"

@@ -58,16 +58,17 @@ as described below.

The minimum requirements are:

+* GNU Make
* Git
* [Node.js](http://nodejs.org/) (0.10.x or higher)
* Python 2.6 or 2.7 with a couple of extra modules (see below)
* Java 7 (JRE and JDK)

-The executables `git`, `java`, `jar`, and `python` should be in your `PATH`.
+The executables `git`, `node`, `python` and `java` should be in your `PATH`.

You can check your configuration by running:

-    $ ./build.py checkdeps
+    $ make check-deps

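If everything is in place, `make check-deps` prints one status line per program and flags anything that is missing. A possible run, with the output format taken from the `check-deps` target in the new `Makefile`:

    $ make check-deps
    Program git OK
    Program node OK
    Program python OK
    Program java OK
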
To install the Node.js dependencies run

@@ -82,24 +83,22 @@ or

depending on your OS and Python installation.

(You can also install the Python modules in a Python virtual environment if you want to.)

## Working with the build tool

-As an ol3 developer you will need to use the `build.py` Python script. This is
-the script to use to run the linter, the compiler, the tests, etc. Windows users
-can use `build.cmd` which is a thin wrapper around `build.py`.
+As an ol3 developer you will use `make` to run build targets defined in the
+`Makefile` located at the root of the repository. The `Makefile` includes
+targets for running the linter, the compiler, the tests, etc.

-The `build.py` script is equivalent to a Makefile. It is actually based on
-[pake](https://github.com/twpayne/pake/), which is a simple implementation of
-`make` in Python.
+The usage of `make` is as follows:

-The usage of the script is:

-    $ ./build.py <target>
+    $ make <target>

where `<target>` is the name of the build target you want to execute. For
example:

-    $ ./build.py test
+    $ make test

The main build targets are `serve`, `lint`, `build`, `test`, and `check`. The
latter is a meta-target that basically runs `lint`, `build`, and `test`.
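
For instance, a few invocations you will use all the time (all of these targets exist in the `Makefile`):

    $ make check        # lint, build ol.js and run the tests; do this before committing
    $ make serve        # start the dev server on port 3000
    $ make build        # build ol.js, ol-debug.js, ol.js.map and ol.css
    $ make clean        # remove generated files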

@@ -116,7 +115,7 @@ and have therefore no chance of being merged into `master`.

To run the `check` target:

-    $ ./build.py check
+    $ make check

If you want to run the full suite of integration tests, see "Running the integration
tests" below.

@@ -125,29 +124,29 @@ tests" below.

To run the examples you first need to start the dev server:

-    $ ./build.py serve
+    $ make serve

-Then, just point your browser <http://localhost:3000/examples> in your browser. For example <http://localhost:3000/examples/side-by-side.html>.
+Then, just point your browser to <http://localhost:3000/build/examples>. For example <http://localhost:3000/build/examples/side-by-side.html>.

Run examples against the `ol.js` standalone build:

-The examples can also be run against the `ol.js` standalone lib, just like the examples
-[hosted](http://openlayers.github.com/ol3/master/examples/) on GitHub. Start by
-executing the `host-examples` build target:
+The examples can also be run against the `ol.js` standalone build, just like
+the examples [hosted](http://openlayers.org/en/master/examples/) on GitHub.
+Start by executing the `host-examples` build target:

-    $ ./build.py host-examples
+    $ make host-examples

-After running `host-examples` you can now open the examples index page in the browser, for example: <http://localhost/~elemoine/ol3/build/hosted/master/examples/>. (This assumes that the `hosted` directory is a web directory, served by Apache for example.)
+After running `host-examples` you can now open the examples index page in the browser: <http://localhost:3000/build/hosted/master/examples/>. (This assumes that you still have the dev server running.)

Append `?mode=raw` to make the example work in full debug mode. In raw mode the OpenLayers and Closure Library scripts are loaded individually by the Closure Library's `base.js` script (which the example page loads and executes before any other script).

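For instance, the same hosted example page can be opened in either mode:

    # default mode (against the compiled build)
    http://localhost:3000/build/hosted/master/examples/side-by-side.html
    # raw (full debug) mode
    http://localhost:3000/build/hosted/master/examples/side-by-side.html?mode=raw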

## Running tests

-To run the tests in a browser start the dev server (`./build.py serve`) and open <http://localhost:3000/test/index.html> in the browser.
+To run the tests in a browser start the dev server (`make serve`) and open <http://localhost:3000/test/index.html> in the browser.

To run the tests on the console (headless testing with PhantomJS) use the `test` target:

-    $ ./build.py test
+    $ make test

See also the test-specific [README](../master/test/README.md).

@@ -161,7 +160,7 @@ displayed in the pull request.

To run the full suite of integration tests use the `ci` target:

-    $ ./build.py ci
+    $ make ci

Running the full suite of integration tests currently takes 5-10 minutes.

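The `ci` target itself is only a meta-target; per the new `Makefile` it is roughly equivalent to running:

    $ make lint build test test-rendering compile-examples check-examples apidoc
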
@@ -174,13 +173,8 @@ Adding functionality often implies adding one or several examples. This
section provides explanations related to adding examples.

The examples are located in the `examples` directory. Adding a new example
-implies creating two files in this directory, an `.html` file and a `.js` file.
-See `examples/simple.html` and `examples/simple.js` for instance.

-The `.html` file needs to include a script tag with
-`loader.js?id=<example_name>` as its `src`. For example, if the two files for
-the example are `myexample.js` and `myexample.html` then the script tag's `src`
-should be set to `myexample`.
+implies creating two or three files in this directory, an `.html` file, a `.js`
+file, and, optionally, a `.css` file.

+You can use `simple.js` and `simple.html` as templates for new examples.

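As a sketch, a hypothetical example named `myexample` (the name is only an illustration) would consist of:

    examples/myexample.html    # page markup; loads the example code via the loader
                               # script tag: <script src="loader.js?id=myexample"></script>
    examples/myexample.js      # the example's JavaScript (goog.require()s plus setup code)
    examples/myexample.css     # optional, styles used only by this example
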
@@ -218,7 +212,7 @@ Your pull request must:

It is strongly recommended that you run

-    $ ./build.py check
+    $ make check

before every commit. This will catch many problems quickly, and it is much
faster than waiting for the Travis CI integration tests to run.

@@ -238,9 +232,9 @@ Guide](http://google-styleguide.googlecode.com/svn/trunk/javascriptguide.xml).
This is checked using the [Closure
Linter](https://developers.google.com/closure/utilities/) in strict mode. You
can run the linter locally on your machine before committing using the `lint`
-target to `build.py`:
+target:

-    $ ./build.py lint
+    $ make lint

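Note that in the new `Makefile` the `lint` target covers more than the Closure Linter; a single run chains:

    $ make lint
    # which, via timestamp targets, runs:
    #   - gjslint (strict, with the ol3 custom jsdoc tags)
    #   - jshint
    #   - bin/check-requires.py   (unused or missing goog.require()s)
    #   - bin/check-whitespace.py (trailing whitespace, CR characters, missing final newline)
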
In addition to fixing problems identified by the linter, please also follow the
style of the existing OpenLayers 3 code, which includes:

@@ -279,7 +273,7 @@ The integration tests contain a number of automated checks to ensure that the
code follows the OpenLayers 3 style and does not break tests or examples. You
can run the integration tests locally using the `ci` target:

-    $ ./build.py ci
+    $ make ci


### Address a single issue or add a single item of functionality

Makefile (new file, 321 lines)
@@ -0,0 +1,321 @@
OS := $(shell uname)
BRANCH := $(shell git rev-parse --abbrev-ref HEAD)

SRC_GLSL := $(shell find src -type f -name '*.glsl')
SRC_SHADER_JS := $(patsubst %.glsl,%shader.js,$(SRC_GLSL))
SRC_JS := $(filter-out $(SRC_SHADER_JS),$(shell find src -name '*.js'))
SRC_JSDOC = $(shell find src -type f -name '*.jsdoc')

SPEC_JS := $(shell find test/spec -type f -name '*.js')
SPEC_RENDERING_JS := $(shell find test_rendering/spec -name '*.js')

EXAMPLES := $(shell find examples -type f)
EXAMPLES_HTML := $(filter-out examples/index.html,$(shell find examples -maxdepth 1 -type f -name '*.html'))
EXAMPLES_JS := $(patsubst %.html,%.js,$(EXAMPLES_HTML))

BUILD_EXAMPLES := $(subst examples,build/examples,$(EXAMPLES))

BUILD_HOSTED := build/hosted/$(BRANCH)
BUILD_HOSTED_EXAMPLES := $(addprefix $(BUILD_HOSTED)/,$(EXAMPLES))
BUILD_HOSTED_EXAMPLES_JS := $(addprefix $(BUILD_HOSTED)/,$(EXAMPLES_JS))

CHECK_EXAMPLE_TIMESTAMPS = $(patsubst examples/%.html,build/timestamps/check-%-timestamp,$(EXAMPLES_HTML))

TASKS_JS := $(shell find tasks -name '*.js')

CLOSURE_LIB = $(shell node -e 'process.stdout.write(require("closure-util").getLibraryPath())')

ifeq ($(OS),Darwin)
	STAT_COMPRESSED = stat -f '  compressed: %z bytes'
	STAT_UNCOMPRESSED = stat -f 'uncompressed: %z bytes'
else
	STAT_COMPRESSED = stat -c '  compressed: %s bytes'
	STAT_UNCOMPRESSED = stat -c 'uncompressed: %s bytes'
endif

.PHONY: default
default: help

.PHONY: help
help:
	@echo
	@echo "The most common targets are:"
	@echo
	@echo "- install      Install node dependencies"
	@echo "- serve        Start dev server for running examples and tests"
	@echo "- test         Run unit tests in the console"
	@echo "- check        Perform a number of checks on the code"
	@echo "- clean        Remove generated files"
	@echo "- help         Display this help message"
	@echo
	@echo "Other less frequently used targets are:"
	@echo
	@echo "- build        Build ol.js, ol-debug.js, ol.js.map and ol.css"
	@echo "- lint         Check the code with the linter"
	@echo "- ci           Run the full continuous integration process"
	@echo "- apidoc       Build the API documentation using JSDoc"
	@echo "- cleanall     Remove all the build artefacts"
	@echo "- check-deps   Check if the required dependencies are installed"
	@echo

.PHONY: apidoc
apidoc: build/timestamps/jsdoc-$(BRANCH)-timestamp

.PHONY: build
build: build/ol.css build/ol.js build/ol-debug.js build/ol.js.map

.PHONY: check
check: lint build/ol.js test

.PHONY: check-examples
check-examples: $(CHECK_EXAMPLE_TIMESTAMPS)

.PHONY: check-deps
check-deps: EXECUTABLES = git node python java
check-deps:
	@for exe in $(EXECUTABLES) ;\
	do \
	  which $${exe} > /dev/null && \
	    echo "Program $${exe} OK" || \
	    echo "Program $${exe} MISSING!" ;\
	done ;\

.PHONY: ci
ci: lint build test test-rendering compile-examples check-examples apidoc

.PHONY: compile-examples
compile-examples: build/compiled-examples/all.combined.js

.PHONY: clean
clean:
	rm -f build/timestamps/gjslint-timestamp
	rm -f build/timestamps/jshint-timestamp
	rm -f build/timestamps/check-*-timestamp
	rm -f build/ol.css
	rm -f build/ol.js
	rm -f build/ol.js.map
	rm -f build/ol-debug.js
	rm -f build/test_requires.js
	rm -f build/test_rendering_requires.js
	rm -rf build/examples
	rm -rf build/compiled-examples
	rm -rf $(BUILD_HOSTED)

.PHONY: cleanall
cleanall:
	rm -rf build

.PHONY: css
css: build/ol.css

.PHONY: examples
examples: $(BUILD_EXAMPLES)

.PHONY: install
install: build/timestamps/node-modules-timestamp

.PHONY: lint
lint: build/timestamps/gjslint-timestamp build/timestamps/jshint-timestamp \
	build/timestamps/check-requires-timestamp \
	build/timestamps/check-whitespace-timestamp

.PHONY: npm-install
npm-install: build/timestamps/node-modules-timestamp

.PHONY: shaders
shaders: $(SRC_SHADER_JS)

.PHONY: serve
serve: build/test_requires.js build/test_rendering_requires.js
	node tasks/serve.js

.PHONY: test
test: build/timestamps/node-modules-timestamp build/test_requires.js
	node tasks/test.js

.PHONY: test-coverage
test-coverage: build/timestamps/node-modules-timestamp
	node tasks/test-coverage.js

.PHONY: test-rendering
test-rendering: build/timestamps/node-modules-timestamp \
	build/test_rendering_requires.js
	@rm -rf build/slimerjs-profile
	@mkdir -p build/slimerjs-profile
	@cp -r test_rendering/slimerjs-profile/* build/slimerjs-profile/
	node tasks/test-rendering.js

.PHONY: host-examples
host-examples: $(BUILD_HOSTED_EXAMPLES) \
	$(BUILD_HOSTED)/build/ol.js \
	$(BUILD_HOSTED)/build/ol-debug.js \
	$(BUILD_HOSTED)/css/ol.css \
	$(BUILD_HOSTED)/examples/loader.js \
	$(BUILD_HOSTED)/build/ol-deps.js

.PHONY: host-libraries
host-libraries: build/timestamps/node-modules-timestamp
	@rm -rf $(BUILD_HOSTED)/closure-library
	@mkdir -p $(BUILD_HOSTED)/closure-library
	@cp -r $(CLOSURE_LIB)/* $(BUILD_HOSTED)/closure-library/
	@rm -rf $(BUILD_HOSTED)/ol/ol
	@mkdir -p $(BUILD_HOSTED)/ol/ol
	@cp -r src/ol/* $(BUILD_HOSTED)/ol/ol/
	@rm -rf $(BUILD_HOSTED)/ol.ext
	@mkdir -p $(BUILD_HOSTED)/ol.ext
	@cp -r build/ol.ext/* $(BUILD_HOSTED)/ol.ext/

$(BUILD_EXAMPLES): $(EXAMPLES)
	@mkdir -p $(@D)
	@node tasks/build-examples.js

build/timestamps/check-%-timestamp: $(BUILD_HOSTED)/examples/%.html \
	$(BUILD_HOSTED)/examples/%.js \
	$(filter $(BUILD_HOSTED)/examples/resources/%,$(BUILD_HOSTED_EXAMPLES)) \
	$(filter $(BUILD_HOSTED)/examples/data/%,$(BUILD_HOSTED_EXAMPLES)) \
	$(BUILD_HOSTED)/examples/loader.js \
	$(BUILD_HOSTED)/build/ol.js \
	$(BUILD_HOSTED)/css/ol.css
	@mkdir -p $(@D)
	./node_modules/.bin/phantomjs --ssl-protocol=any --ignore-ssl-errors=true bin/check-example.js $(addsuffix ?mode=advanced, $<)
	@touch $@

build/timestamps/check-requires-timestamp: $(SRC_JS) $(EXAMPLES_JS) \
	$(SRC_SHADER_JS) $(SPEC_JS) \
	$(SPEC_RENDERING_JS)
	@mkdir -p $(@D)
	@python bin/check-requires.py $(CLOSURE_LIB) $^
	@touch $@

build/timestamps/check-whitespace-timestamp: $(SRC_JS) $(EXAMPLES_JS) \
	$(SPEC_JS) $(SPEC_RENDERING_JS) \
	$(SRC_JSDOC)
	@mkdir -p $(@D)
	@python bin/check-whitespace.py $^
	@touch $@

build/compiled-examples/all.js: $(EXAMPLES_JS)
	@mkdir -p $(@D)
	@python bin/combine-examples.py $^ > $@

build/compiled-examples/all.combined.js: config/examples-all.json build/compiled-examples/all.js \
	$(SRC_JS) $(SRC_SHADER_JS) \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	node tasks/build.js $< $@

build/compiled-examples/%.json: config/example.json build/examples/%.js \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	@sed -e 's|{{id}}|$*|' $< > $@

build/compiled-examples/%.combined.js: build/compiled-examples/%.json \
	$(SRC_JS) $(SRC_SHADER_JS) \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	node tasks/build.js $< $@

build/timestamps/jsdoc-$(BRANCH)-timestamp: config/jsdoc/api/index.md \
	config/jsdoc/api/conf.json $(SRC_JS) \
	$(SRC_SHADER_JS) \
	$(shell find config/jsdoc/api/template -type f) \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	@rm -rf $(BUILD_HOSTED)/apidoc
	./node_modules/.bin/jsdoc config/jsdoc/api/index.md -c config/jsdoc/api/conf.json -d $(BUILD_HOSTED)/apidoc
	@touch $@

build/timestamps/gjslint-timestamp: $(SRC_JS) $(SPEC_JS) $(SPEC_RENDERING_JS) \
	$(EXAMPLES_JS)
	@mkdir -p $(@D)
	@echo "Running gjslint..."
	@gjslint --jslint_error=all --custom_jsdoc_tags=event,fires,function,classdesc,api,observable --strict $?
	@touch $@

$(BUILD_HOSTED_EXAMPLES_JS): $(BUILD_HOSTED)/examples/%.js: build/examples/%.js
	@mkdir -p $(@D)
	@python bin/split-example.py $< $(@D)

$(BUILD_HOSTED)/examples/loader.js: bin/loader_hosted_examples.js
	@mkdir -p $(@D)
	@cp $< $@

$(BUILD_HOSTED)/examples/%: build/examples/%
	@mkdir -p $(@D)
	@cp $< $@

$(BUILD_HOSTED)/build/ol.js: build/ol.js
	@mkdir -p $(@D)
	@cp $< $@

$(BUILD_HOSTED)/build/ol-debug.js: build/ol-debug.js
	@mkdir -p $(@D)
	@cp $< $@

$(BUILD_HOSTED)/css/ol.css: build/ol.css
	@mkdir -p $(@D)
	@cp $< $@

$(BUILD_HOSTED)/build/ol-deps.js: host-libraries
	@mkdir -p $(@D)
	@python $(CLOSURE_LIB)/closure/bin/build/depswriter.py \
	  --root_with_prefix "src ../../../ol" \
	  --root_with_prefix "build/ol.ext ../../../ol.ext" \
	  --root $(BUILD_HOSTED)/closure-library/closure/goog \
	  --root_with_prefix "$(BUILD_HOSTED)/closure-library/third_party ../../third_party" \
	  --output_file $@

build/timestamps/jshint-timestamp: $(SRC_JS) $(SPEC_JS) $(SPEC_RENDERING_JS) \
	$(TASKS_JS) $(EXAMPLES_JS) \
	examples/resources/common.js \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	@echo "Running jshint..."
	@./node_modules/.bin/jshint --verbose $?
	@touch $@

build/timestamps/node-modules-timestamp: package.json
	@mkdir -p $(@D)
	npm install
	@touch $@

build/ol.css: css/ol.css build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	@echo "Running cleancss..."
	@./node_modules/.bin/cleancss $< > $@

build/ol.js: config/ol.json $(SRC_JS) $(SRC_SHADER_JS) \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	node tasks/build.js $< $@
	@$(STAT_UNCOMPRESSED) $@
	@cp $@ /tmp/
	@gzip /tmp/ol.js
	@$(STAT_COMPRESSED) /tmp/ol.js.gz
	@rm /tmp/ol.js.gz

build/ol.js.map: config/ol.json $(SRC_JS) $(SRC_SHADER_JS) \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	node tasks/build.js $< $@

build/ol-debug.js: config/ol-debug.json $(SRC_JS) $(SRC_SHADER_JS) \
	build/timestamps/node-modules-timestamp
	@mkdir -p $(@D)
	node tasks/build.js $< $@
	@$(STAT_UNCOMPRESSED) $@
	@cp $@ /tmp/
	@gzip /tmp/ol-debug.js
	@$(STAT_COMPRESSED) /tmp/ol-debug.js.gz
	@rm /tmp/ol-debug.js.gz

build/test_requires.js: $(SPEC_JS) $(SRC_JS)
	@mkdir -p $(@D)
	@node tasks/generate-requires.js $^ > $@

build/test_rendering_requires.js: $(SPEC_RENDERING_JS)
	@mkdir -p $(@D)
	@node tasks/generate-requires.js $^ > $@

%shader.js: %.glsl src/ol/webgl/shader.mustache bin/pyglslunit.py
	@python bin/pyglslunit.py --input $< --template src/ol/webgl/shader.mustache --output $@
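
The gjslint and jshint targets above are incremental: each one touches a file under `build/timestamps/` on success and, on the next run, passes only `$?` (the prerequisites newer than that timestamp) to the linter. A sketch of the resulting behaviour (the touched path is only an illustration):

    $ make lint     # first run lints everything, then touches build/timestamps/*-timestamp
    $ make lint     # nothing is newer than the timestamps, so there is nothing to do
    $ touch src/ol/somefile.js
    $ make lint     # only the touched file is handed to gjslint and jshint via $?
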
bin/check-requires.py (new file, 190 lines)
@@ -0,0 +1,190 @@
|
||||
import os
|
||||
import logging
|
||||
import re
|
||||
import sys
|
||||
|
||||
logging.basicConfig(format='%(asctime)s %(name)s: %(message)s',
|
||||
level=logging.INFO)
|
||||
|
||||
logger = logging.getLogger('check-requires')
|
||||
|
||||
|
||||
class Node(object):
|
||||
|
||||
def __init__(self):
|
||||
self.present = False
|
||||
self.children = {}
|
||||
|
||||
def _build_re(self, key):
|
||||
if key == '*':
|
||||
assert len(self.children) == 0
|
||||
# We want to match `.doIt` but not `.SomeClass` or `.more.stuff`
|
||||
return '(?=\\.[a-z]\\w*\\b(?!\\.))'
|
||||
elif len(self.children) == 1:
|
||||
child_key, child = next(self.children.iteritems())
|
||||
child_re = child._build_re(child_key)
|
||||
if child_key != '*':
|
||||
child_re = '\\.' + child_re
|
||||
if self.present:
|
||||
return key + '(' + child_re + ')?'
|
||||
else:
|
||||
return key + child_re
|
||||
elif self.children:
|
||||
children_re = '(?:' + '|'.join(
|
||||
('\\.' if k != '*' else '') + self.children[k]._build_re(k)
|
||||
for k in sorted(self.children.keys())) + ')'
|
||||
if self.present:
|
||||
return key + children_re + '?'
|
||||
else:
|
||||
return key + children_re
|
||||
else:
|
||||
assert self.present
|
||||
return key
|
||||
|
||||
def build_re(self, key):
|
||||
return re.compile('\\b' + self._build_re(key) + '\\b')
|
||||
|
||||
|
||||
def ifind(*paths):
|
||||
"""ifind is an iterative version of os.walk, yielding all walked paths and
|
||||
normalizing paths to use forward slashes."""
|
||||
for path in paths:
|
||||
for dirpath, dirnames, names in os.walk(path):
|
||||
for name in names:
|
||||
if os.sep == '/':
|
||||
yield os.path.join(dirpath, name)
|
||||
else:
|
||||
yield '/'.join(dirpath.split(os.sep) + [name])
|
||||
|
||||
|
||||
def _strip_comments(lines):
|
||||
# FIXME this is a horribe hack, we should use a proper JavaScript parser
|
||||
# here
|
||||
in_multiline_comment = False
|
||||
lineno = 0
|
||||
for line in lines:
|
||||
lineno += 1
|
||||
if in_multiline_comment:
|
||||
index = line.find('*/')
|
||||
if index != -1:
|
||||
in_multiline_comment = False
|
||||
line = line[index + 2:]
|
||||
if not in_multiline_comment:
|
||||
line = re.sub(r'//[^\n]*', '', line)
|
||||
line = re.sub(r'/\*.*?\*/', '', line)
|
||||
index = line.find('/*')
|
||||
if index != -1:
|
||||
yield lineno, line[:index]
|
||||
in_multiline_comment = True
|
||||
else:
|
||||
yield lineno, line
|
||||
|
||||
|
||||
def check_requires(closure_lib, *filenames):
|
||||
unused_count = 0
|
||||
all_provides = set()
|
||||
|
||||
for filename in ifind(closure_lib):
|
||||
if filename.endswith('.js'):
|
||||
if not re.match(r'.*/closure/goog/', filename):
|
||||
continue
|
||||
# Skip goog.i18n because it contains so many modules that it causes
|
||||
# the generated regular expression to exceed Python's limits
|
||||
if re.match(r'.*/closure/goog/i18n/', filename):
|
||||
continue
|
||||
for line in open(filename, 'rU'):
|
||||
m = re.match(r'goog.provide\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
all_provides.add(m.group(1))
|
||||
|
||||
for filename in sorted(filenames):
|
||||
require_linenos = {}
|
||||
uses = set()
|
||||
lines = open(filename, 'rU').readlines()
|
||||
for lineno, line in _strip_comments(lines):
|
||||
m = re.match(r'goog.provide\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
all_provides.add(m.group(1))
|
||||
continue
|
||||
m = re.match(r'goog.require\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
require_linenos[m.group(1)] = lineno
|
||||
continue
|
||||
ignore_linenos = require_linenos.values()
|
||||
for lineno, line in enumerate(lines):
|
||||
if lineno in ignore_linenos:
|
||||
continue
|
||||
for require in require_linenos.iterkeys():
|
||||
if require in line:
|
||||
uses.add(require)
|
||||
for require in sorted(set(require_linenos.keys()) - uses):
|
||||
logger.info('%s:%d: unused goog.require: %r' % (
|
||||
filename, require_linenos[require], require))
|
||||
unused_count += 1
|
||||
|
||||
all_provides.discard('ol')
|
||||
all_provides.discard('ol.MapProperty')
|
||||
|
||||
root = Node()
|
||||
for provide in all_provides:
|
||||
node = root
|
||||
for component in provide.split('.'):
|
||||
if component not in node.children:
|
||||
node.children[component] = Node()
|
||||
node = node.children[component]
|
||||
if component[0].islower():
|
||||
# We've arrived at a namespace provide like `ol.foo`.
|
||||
# In this case, we want to match uses like `ol.foo.doIt()` but
|
||||
# not match things like `new ol.foo.SomeClass()`.
|
||||
# For this purpose, we use the special wildcard key for the child.
|
||||
node.children['*'] = Node()
|
||||
else:
|
||||
node.present = True
|
||||
provide_res = [child.build_re(key)
|
||||
for key, child in root.children.iteritems()]
|
||||
missing_count = 0
|
||||
for filename in sorted(filenames):
|
||||
provides = set()
|
||||
requires = set()
|
||||
uses = set()
|
||||
uses_linenos = {}
|
||||
for lineno, line in _strip_comments(open(filename, 'rU')):
|
||||
m = re.match(r'goog.provide\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
provides.add(m.group(1))
|
||||
continue
|
||||
m = re.match(r'goog.require\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
requires.add(m.group(1))
|
||||
continue
|
||||
while True:
|
||||
for provide_re in provide_res:
|
||||
m = provide_re.search(line)
|
||||
if m:
|
||||
uses.add(m.group())
|
||||
uses_linenos[m.group()] = lineno
|
||||
line = line[:m.start()] + line[m.end():]
|
||||
break
|
||||
else:
|
||||
break
|
||||
if filename == 'src/ol/renderer/layerrenderer.js':
|
||||
uses.discard('ol.renderer.Map')
|
||||
m = re.match(
|
||||
r'src/ol/renderer/(\w+)/\1(\w*)layerrenderer\.js\Z', filename)
|
||||
if m:
|
||||
uses.discard('ol.renderer.Map')
|
||||
uses.discard('ol.renderer.%s.Map' % (m.group(1),))
|
||||
missing_requires = uses - requires - provides
|
||||
if missing_requires:
|
||||
for missing_require in sorted(missing_requires):
|
||||
logger.info("%s:%d missing goog.require('%s')" %
|
||||
(filename, uses_linenos[missing_require],
|
||||
missing_require))
|
||||
missing_count += 1
|
||||
if unused_count or missing_count:
|
||||
logger.error('%d unused goog.requires, %d missing goog.requires' %
|
||||
(unused_count, missing_count))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
check_requires(*sys.argv[1:])
|
||||
bin/check-whitespace.py (new file, 44 lines)
@@ -0,0 +1,44 @@
import logging
import re
import sys

logging.basicConfig(format='%(asctime)s %(name)s: %(message)s',
                    level=logging.INFO)

logger = logging.getLogger('check-whitespace')

CR_RE = re.compile(r'\r')
LEADING_WHITESPACE_RE = re.compile(r'\s+')
TRAILING_WHITESPACE_RE = re.compile(r'\s+\n\Z')
NO_NEWLINE_RE = re.compile(r'[^\n]\Z')
ALL_WHITESPACE_RE = re.compile(r'\s+\Z')


def check_whitespace(*filenames):
    errors = 0
    for filename in sorted(filenames):
        whitespace = False
        for lineno, line in enumerate(open(filename, 'rU')):
            if lineno == 0 and LEADING_WHITESPACE_RE.match(line):
                logger.info('%s:%d: leading whitespace', filename, lineno + 1)
                errors += 1
            if CR_RE.search(line):
                logger.info('%s:%d: carriage return character in line',
                            filename, lineno + 1)
                errors += 1
            if TRAILING_WHITESPACE_RE.search(line):
                logger.info('%s:%d: trailing whitespace', filename, lineno + 1)
                errors += 1
            if NO_NEWLINE_RE.search(line):
                logger.info('%s:%d: no newline at end of file', filename,
                            lineno + 1)
                errors += 1
            whitespace = ALL_WHITESPACE_RE.match(line)
        if whitespace:
            logger.info('%s: trailing whitespace at end of file', filename)
            errors += 1
    if errors:
        logger.error('%d whitespace errors' % (errors,))

if __name__ == "__main__":
    check_whitespace(*sys.argv[1:])
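
The script is normally driven by the `check-whitespace` timestamp target in the Makefile, but it can also be pointed at files directly; problems are logged per line and a failing run ends with an error summary (the file path and log lines below are illustrative):

    $ python bin/check-whitespace.py src/ol/somefile.js
    ... check-whitespace: src/ol/somefile.js:42: trailing whitespace
    ... check-whitespace: 1 whitespace errors
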
@@ -6,8 +6,8 @@
 * loads Closure Library's base.js, ol-deps.js, the example's "goog.require"
 * script, and the example's script in "development" mode.
 *
- * The ol.js and ol-deps.js scripts are built by OL3's build.py script.
- * They are located in the ../build/ directory, relative to this script.
+ * The ol.js and ol-deps.js scripts are built using ol3's Makefile. They are
+ * located in the ../build/ directory, relative to this script.
 *
 * The script must be named loader.js.
 *

bin/split-example.py (new file, 39 lines)
@@ -0,0 +1,39 @@
import os
import re
import sys


def split_example_file(example, dst_dir):
    lines = open(example, 'rU').readlines()

    target_lines = []
    target_require_lines = []

    found_requires = False
    found_code = False
    for line in lines:
        m = re.match(r'goog.require\(\'(.*)\'\);', line)
        if m:
            found_requires = True
            target_require_lines.append(line)
        elif found_requires:
            if found_code or line not in ('\n', '\r\n'):
                found_code = True
                target_lines.append(line)

    target = open(
        os.path.join(dst_dir, os.path.basename(example)), 'wb')
    target_require = open(
        os.path.join(dst_dir, os.path.basename(example)
                     .replace('.js', '-require.js')),
        'wb')

    target.writelines(target_lines)
    target.close()

    target_require.writelines(target_require_lines)
    target_require.close()


if __name__ == '__main__':
    split_example_file(*sys.argv[1:])
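
In the `host-examples` flow the Makefile runs this script once per built example; it writes the example code (minus the requires block) and splits the `goog.require()` lines out into a separate `*-require.js` file in the destination directory. For instance (example name and branch are illustrative):

    $ python bin/split-example.py build/examples/myexample.js build/hosted/master/examples
    # writes:
    #   build/hosted/master/examples/myexample.js           (the code after the requires)
    #   build/hosted/master/examples/myexample-require.js   (only the goog.require() lines)
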
build.py (deleted file, 831 lines)
@@ -1,831 +0,0 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
from cStringIO import StringIO
|
||||
import glob
|
||||
import gzip
|
||||
import json
|
||||
import multiprocessing
|
||||
import os
|
||||
import re
|
||||
import signal
|
||||
import shutil
|
||||
import sys
|
||||
|
||||
from pake import Target
|
||||
from pake import ifind, main, output, rule, target, variables, virtual, which
|
||||
from Queue import Queue
|
||||
from threading import Thread
|
||||
|
||||
def sigint_handler(signal, frame):
|
||||
print('Exiting')
|
||||
sys.exit(0)
|
||||
|
||||
class ThreadPool:
|
||||
"""A basic pool of worker threads"""
|
||||
class Worker(Thread):
|
||||
def __init__(self, tasks):
|
||||
Thread.__init__(self)
|
||||
self.tasks = tasks
|
||||
self.daemon = True # threads will be killed on exit
|
||||
self.start()
|
||||
|
||||
def run(self):
|
||||
while True:
|
||||
# block until a task is ready to be done
|
||||
function, args, kargs = self.tasks.get()
|
||||
try:
|
||||
function(*args, **kargs)
|
||||
except:
|
||||
print("ERROR")
|
||||
for count, thing in enumerate(args):
|
||||
print '{0}. {1}'.format(count, thing)
|
||||
print(sys.exc_info()[0])
|
||||
print("ERROR")
|
||||
self.tasks.errors = True
|
||||
self.tasks.task_done()
|
||||
|
||||
def __init__(self, num_threads = multiprocessing.cpu_count() + 1):
|
||||
self.tasks = Queue(num_threads)
|
||||
self.tasks.errors = False
|
||||
# create num_threads Workers, by default the number of CPUs + 1
|
||||
for _ in range(num_threads): self.Worker(self.tasks)
|
||||
|
||||
def add_task(self, function, *args, **kargs):
|
||||
self.tasks.put((function, args, kargs))
|
||||
|
||||
def wait_completion(self):
|
||||
# wait for the queue to be empty
|
||||
self.tasks.join()
|
||||
return self.tasks.errors
|
||||
|
||||
|
||||
if sys.platform == 'win32':
|
||||
|
||||
win = {
|
||||
'CLEANCSS': './node_modules/.bin/cleancss',
|
||||
'GIT': 'git.exe',
|
||||
'GJSLINT': 'gjslint.exe',
|
||||
'JSDOC': './node_modules/.bin/jsdoc',
|
||||
'JSHINT': './node_modules/.bin/jshint',
|
||||
'PYTHON': 'python.exe',
|
||||
'PHANTOMJS': './node_modules/.bin/phantomjs'
|
||||
}
|
||||
|
||||
sys_dir = os.environ.get('SYSTEMDRIVE')
|
||||
program_files = os.environ.get('PROGRAMFILES')
|
||||
|
||||
if not which(win['GIT']):
|
||||
win['GIT'] = os.path.join(program_files, 'Git', 'cmd', 'git.exe')
|
||||
if not which(win['GIT']):
|
||||
win['GIT'] = os.path.join(program_files, 'Git', 'bin', 'git.exe')
|
||||
|
||||
if not which(win['PYTHON']):
|
||||
win['PYTHON'] = os.path.join(sys_dir, 'Python27', 'python.exe')
|
||||
|
||||
if not which(win['GJSLINT']):
|
||||
win['GJSLINT'] = os.path.join(sys_dir, 'Python27', 'Scripts', 'gjslint.exe')
|
||||
|
||||
if not which(win['PHANTOMJS']):
|
||||
win['PHANTOMJS'] = 'phantomjs.exe'
|
||||
if not which(win['PHANTOMJS']):
|
||||
win['PHANTOMJS'] = os.path.join(sys_dir, 'phantomjs-1.9.7-windows', 'phantomjs.exe')
|
||||
|
||||
if not which(win['JSDOC']):
|
||||
win['JSDOC'] = os.path.join(program_files, 'jsdoc3', 'jsdoc.cmd')
|
||||
|
||||
for program, path in win.iteritems():
|
||||
setattr(variables, program, path)
|
||||
|
||||
else:
|
||||
variables.CLEANCSS = './node_modules/.bin/cleancss'
|
||||
variables.GIT = 'git'
|
||||
variables.GJSLINT = 'gjslint'
|
||||
variables.JSHINT = './node_modules/.bin/jshint'
|
||||
variables.JSDOC = './node_modules/.bin/jsdoc'
|
||||
variables.PYTHON = 'python'
|
||||
variables.PHANTOMJS = './node_modules/.bin/phantomjs'
|
||||
|
||||
variables.BRANCH = output(
|
||||
'%(GIT)s', 'rev-parse', '--abbrev-ref', 'HEAD').strip()
|
||||
|
||||
EXECUTABLES = [variables.CLEANCSS, variables.GIT, variables.GJSLINT,
|
||||
variables.JSDOC, variables.JSHINT, variables.PYTHON,
|
||||
variables.PHANTOMJS]
|
||||
|
||||
EXAMPLES_SRC_ALL = [path for path in ifind('examples')]
|
||||
|
||||
EXAMPLES_SRC_HTML = [path
|
||||
for path in EXAMPLES_SRC_ALL
|
||||
if path.endswith('.html')
|
||||
if path != 'examples/index.html']
|
||||
|
||||
EXAMPLES_SRC_JS = [example.replace('.html', '.js')
|
||||
for example in EXAMPLES_SRC_HTML]
|
||||
|
||||
EXAMPLES_DEST_ALL = [path.replace('examples', 'build/examples')
|
||||
for path in EXAMPLES_SRC_ALL]
|
||||
|
||||
GLSL_SRC = [path
|
||||
for path in ifind('src')
|
||||
if path.endswith('.glsl')]
|
||||
|
||||
JSDOC_SRC = [path
|
||||
for path in ifind('src')
|
||||
if path.endswith('.jsdoc')]
|
||||
|
||||
SHADER_SRC = [path.replace('.glsl', 'shader.js')
|
||||
for path in GLSL_SRC]
|
||||
|
||||
SPEC = [path
|
||||
for path in ifind('test/spec')
|
||||
if path.endswith('.js')]
|
||||
|
||||
SPEC_RENDERING = [path
|
||||
for path in ifind('test_rendering/spec')
|
||||
if path.endswith('.js')]
|
||||
|
||||
TASKS = [path
|
||||
for path in ifind('tasks')
|
||||
if path.endswith('.js')]
|
||||
|
||||
SRC = [path
|
||||
for path in ifind('src/ol')
|
||||
if path.endswith('.js')
|
||||
if path not in SHADER_SRC]
|
||||
|
||||
NPM_INSTALL = 'build/npm-install-timestamp'
|
||||
|
||||
def report_sizes(t):
|
||||
stringio = StringIO()
|
||||
gzipfile = gzip.GzipFile(t.name, 'w', 9, stringio)
|
||||
with open(t.name, 'rb') as f:
|
||||
shutil.copyfileobj(f, gzipfile)
|
||||
gzipfile.close()
|
||||
rawsize = os.stat(t.name).st_size
|
||||
gzipsize = len(stringio.getvalue())
|
||||
savings = '{0:.2%}'.format((rawsize - gzipsize)/float(rawsize))
|
||||
t.info('uncompressed: %8d bytes', rawsize)
|
||||
t.info(' compressed: %8d bytes, (saved %s)', gzipsize, savings)
|
||||
|
||||
|
||||
virtual('default', 'build')
|
||||
|
||||
|
||||
virtual('ci', 'lint', 'build', 'test', 'test-rendering',
|
||||
'build/compiled-examples/all.combined.js', 'check-examples', 'apidoc')
|
||||
|
||||
|
||||
virtual('build', 'build/ol.css', 'build/ol.js', 'build/ol-debug.js',
|
||||
'build/ol.js.map')
|
||||
|
||||
|
||||
virtual('check', 'lint', 'build/ol.js', 'test')
|
||||
|
||||
|
||||
virtual('todo', 'fixme')
|
||||
|
||||
|
||||
@target(NPM_INSTALL, 'package.json')
|
||||
def npm_install(t):
|
||||
t.run('npm', 'install')
|
||||
t.touch()
|
||||
|
||||
|
||||
@target('build/ol.css', 'css/ol.css', NPM_INSTALL)
|
||||
def build_ol_css(t):
|
||||
t.output('%(CLEANCSS)s', 'css/ol.css')
|
||||
|
||||
|
||||
def _build_js(t):
|
||||
t.run('node', 'tasks/build.js', 'config/ol.json', 'build/ol.js')
|
||||
|
||||
|
||||
@target('build/ol.js', SRC, SHADER_SRC, 'config/ol.json', NPM_INSTALL)
|
||||
def build_ol_js(t):
|
||||
_build_js(t)
|
||||
report_sizes(t)
|
||||
|
||||
|
||||
@target('build/ol.js.map', SRC, SHADER_SRC, 'config/ol.json', NPM_INSTALL)
|
||||
def build_ol_js_map(t):
|
||||
_build_js(t)
|
||||
|
||||
|
||||
@target('build/ol-debug.js', SRC, SHADER_SRC, 'config/ol-debug.json',
|
||||
NPM_INSTALL)
|
||||
def build_ol_debug_js(t):
|
||||
t.run('node', 'tasks/build.js', 'config/ol-debug.json', 'build/ol-debug.js')
|
||||
report_sizes(t)
|
||||
|
||||
|
||||
for glsl_src in GLSL_SRC:
|
||||
def shader_src_helper(glsl_src):
|
||||
@target(glsl_src.replace('.glsl', 'shader.js'), glsl_src,
|
||||
'src/ol/webgl/shader.mustache', 'bin/pyglslunit.py')
|
||||
def shader_src(t):
|
||||
t.run('%(PYTHON)s', 'bin/pyglslunit.py',
|
||||
'--input', glsl_src,
|
||||
'--template', 'src/ol/webgl/shader.mustache',
|
||||
'--output', t.name)
|
||||
shader_src_helper(glsl_src)
|
||||
|
||||
|
||||
def build_requires(task):
|
||||
requires = set()
|
||||
for dependency in task.dependencies:
|
||||
for line in open(dependency, 'rU'):
|
||||
match = re.match(r'goog\.provide\(\'(.*)\'\);', line)
|
||||
if match:
|
||||
requires.add(match.group(1))
|
||||
with open(task.name, 'wb') as f:
|
||||
for require in sorted(requires):
|
||||
f.write('goog.require(\'%s\');\n' % (require,))
|
||||
|
||||
|
||||
@target('build/test_requires.js', SPEC)
|
||||
def build_test_requires(t):
|
||||
build_requires(t)
|
||||
|
||||
|
||||
@target('build/test_rendering_requires.js', SPEC_RENDERING)
|
||||
def build_test_rendering_requires(t):
|
||||
build_requires(t)
|
||||
|
||||
|
||||
virtual('examples', EXAMPLES_DEST_ALL)
|
||||
|
||||
|
||||
@rule(r'\Abuild\/examples/(?P<filepath>.*)\Z')
|
||||
def examples_dest(name, match):
|
||||
def action(t):
|
||||
t.run('node', 'tasks/build-examples.js')
|
||||
dependencies = ['examples/%(filepath)s' % match.groupdict()]
|
||||
return Target(name, action=action, dependencies=dependencies)
|
||||
|
||||
|
||||
@target('build/compiled-examples/all.combined.js', 'build/compiled-examples/all.js',
|
||||
SRC, SHADER_SRC, 'config/examples-all.json', NPM_INSTALL)
|
||||
def build_examples_all_combined_js(t):
|
||||
t.run('node', 'tasks/build.js', 'config/examples-all.json',
|
||||
'build/compiled-examples/all.combined.js')
|
||||
report_sizes(t)
|
||||
|
||||
|
||||
@target('build/compiled-examples/all.js', EXAMPLES_SRC_JS)
|
||||
def build_examples_all_js(t):
|
||||
t.output('%(PYTHON)s', 'bin/combine-examples.py', t.dependencies)
|
||||
|
||||
|
||||
@rule(r'\Abuild/compiled-examples/(?P<id>.*).json\Z')
|
||||
def examples_star_json(name, match):
|
||||
def action(t):
|
||||
|
||||
# When compiling the ol3 code and the application code together it is
|
||||
# better to use oli.js and olx.js files as "input" files rather than
|
||||
# "externs" files. Indeed, externs prevent renaming, which is neither
|
||||
# necessary nor desirable in this case.
|
||||
#
|
||||
# oli.js and olx.js do not provide or require namespaces (using
|
||||
# "goog.provide" or "goog.require"). For that reason, if they are
|
||||
# specified as input files through the "src" property, then
|
||||
# closure-util will exclude them when creating the dependencies graph.
|
||||
# So the compile "js" property is used instead. With that property the
|
||||
# oli.js and olx.js files are passed directly to the compiler. And by
|
||||
# setting "manage_closure_dependencies" to "true" the compiler will not
|
||||
# exclude them from its dependencies graph.
|
||||
|
||||
content = json.dumps({
|
||||
"exports": [],
|
||||
"src": [
|
||||
"src/**/*.js",
|
||||
"build/ol.ext/*.js",
|
||||
"build/examples/%(id)s.js" % match.groupdict()],
|
||||
"compile": {
|
||||
"js": [
|
||||
"externs/olx.js",
|
||||
"externs/oli.js",
|
||||
],
|
||||
"externs": [
|
||||
"externs/bingmaps.js",
|
||||
"externs/bootstrap.js",
|
||||
"externs/closure-compiler.js",
|
||||
"externs/esrijson.js",
|
||||
"externs/example.js",
|
||||
"externs/fastclick.js",
|
||||
"externs/geojson.js",
|
||||
"externs/jquery-1.9.js",
|
||||
"externs/proj4js.js",
|
||||
"externs/tilejson.js",
|
||||
"externs/topojson.js"
|
||||
],
|
||||
"define": [
|
||||
"goog.array.ASSUME_NATIVE_FUNCTIONS=true",
|
||||
"goog.dom.ASSUME_STANDARDS_MODE=true",
|
||||
"goog.json.USE_NATIVE_JSON=true",
|
||||
"goog.DEBUG=false"
|
||||
],
|
||||
"jscomp_error": [
|
||||
"accessControls",
|
||||
"ambiguousFunctionDecl",
|
||||
"checkDebuggerStatement",
|
||||
"checkEventfulObjectDisposal",
|
||||
"checkRegExp",
|
||||
"checkStructDictInheritance",
|
||||
"checkTypes",
|
||||
"checkVars",
|
||||
"const",
|
||||
"constantProperty",
|
||||
"deprecated",
|
||||
"duplicate",
|
||||
"duplicateMessage",
|
||||
"es3",
|
||||
"es5Strict",
|
||||
"externsValidation",
|
||||
"fileoverviewTags",
|
||||
"globalThis",
|
||||
"internetExplorerChecks",
|
||||
"invalidCasts",
|
||||
"misplacedTypeAnnotation",
|
||||
"missingProperties",
|
||||
"missingProvide",
|
||||
"missingRequire",
|
||||
"missingReturn",
|
||||
"newCheckTypes",
|
||||
"nonStandardJsDocs",
|
||||
"strictModuleDepCheck",
|
||||
"suspiciousCode",
|
||||
"typeInvalidation",
|
||||
"tweakValidation",
|
||||
"undefinedNames",
|
||||
"undefinedVars",
|
||||
"uselessCode",
|
||||
"violatedModuleDep",
|
||||
"visibility"
|
||||
],
|
||||
"jscomp_off": [
|
||||
"unknownDefines"
|
||||
],
|
||||
"extra_annotation_name": [
|
||||
"api", "observable"
|
||||
],
|
||||
"compilation_level": "ADVANCED",
|
||||
"warning_level": "VERBOSE",
|
||||
"output_wrapper": "(function(){%output%})();",
|
||||
"use_types_for_optimization": True,
|
||||
"manage_closure_dependencies": True
|
||||
}
|
||||
})
|
||||
with open(t.name, 'wb') as f:
|
||||
f.write(content)
|
||||
return Target(name, action=action,
|
||||
dependencies=[__file__, NPM_INSTALL])
|
||||
|
||||
|
||||
@rule(r'\Abuild/compiled-examples/(?P<id>.*).combined.js\Z')
|
||||
def examples_star_combined_js(name, match):
|
||||
def action(t):
|
||||
config = 'build/compiled-examples/%(id)s.json' % match.groupdict()
|
||||
t.run('node', 'tasks/build.js', config, name)
|
||||
report_sizes(t)
|
||||
dependencies = [SRC, SHADER_SRC,
|
||||
'examples/%(id)s.js' % match.groupdict(),
|
||||
'build/compiled-examples/%(id)s.json' % match.groupdict(),
|
||||
NPM_INSTALL]
|
||||
return Target(name, action=action, dependencies=dependencies)
|
||||
|
||||
|
||||
@target('serve', 'build/test_requires.js', 'build/test_rendering_requires.js',
|
||||
NPM_INSTALL)
|
||||
def serve(t):
|
||||
t.run('node', 'tasks/serve.js')
|
||||
|
||||
|
||||
virtual('lint', 'build/lint-timestamp', 'build/check-requires-timestamp',
|
||||
'build/check-whitespace-timestamp', 'jshint')
|
||||
|
||||
|
||||
@target('build/lint-timestamp', SRC, EXAMPLES_SRC_JS, SPEC, SPEC_RENDERING,
|
||||
precious=True)
|
||||
def build_lint_src_timestamp(t):
|
||||
t.run('%(GJSLINT)s',
|
||||
'--jslint_error=all',
|
||||
'--custom_jsdoc_tags=event,fires,function,classdesc,api,observable',
|
||||
'--strict',
|
||||
t.newer(t.dependencies))
|
||||
t.touch()
|
||||
|
||||
virtual('jshint', 'build/jshint-timestamp')
|
||||
|
||||
@target('build/jshint-timestamp', SRC, EXAMPLES_SRC_JS, SPEC, SPEC_RENDERING,
|
||||
'examples/resources/common.js', TASKS, NPM_INSTALL, precious=True)
|
||||
def build_jshint_timestamp(t):
|
||||
t.run(variables.JSHINT, '--verbose', t.newer(t.dependencies))
|
||||
t.touch()
|
||||
|
||||
|
||||
def _strip_comments(lines):
|
||||
# FIXME this is a horribe hack, we should use a proper JavaScript parser
|
||||
# here
|
||||
in_multiline_comment = False
|
||||
lineno = 0
|
||||
for line in lines:
|
||||
lineno += 1
|
||||
if in_multiline_comment:
|
||||
index = line.find('*/')
|
||||
if index != -1:
|
||||
in_multiline_comment = False
|
||||
line = line[index + 2:]
|
||||
if not in_multiline_comment:
|
||||
line = re.sub(r'//[^\n]*', '', line)
|
||||
line = re.sub(r'/\*.*?\*/', '', line)
|
||||
index = line.find('/*')
|
||||
if index != -1:
|
||||
yield lineno, line[:index]
|
||||
in_multiline_comment = True
|
||||
else:
|
||||
yield lineno, line
|
||||
|
||||
|
||||
@target('build/check-requires-timestamp', SRC, EXAMPLES_SRC_JS, SHADER_SRC,
|
||||
SPEC, SPEC_RENDERING)
|
||||
def build_check_requires_timestamp(t):
|
||||
unused_count = 0
|
||||
all_provides = set()
|
||||
closure_lib_path = output('node', '-e',
|
||||
'process.stdout.write(require("closure-util").getLibraryPath())')
|
||||
for filename in ifind(closure_lib_path):
|
||||
if filename.endswith('.js'):
|
||||
if not re.match(r'.*/closure/goog/', filename):
|
||||
continue
|
||||
# Skip goog.i18n because it contains so many modules that it causes
|
||||
# the generated regular expression to exceed Python's limits
|
||||
if re.match(r'.*/closure/goog/i18n/', filename):
|
||||
continue
|
||||
for line in open(filename, 'rU'):
|
||||
m = re.match(r'goog.provide\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
all_provides.add(m.group(1))
|
||||
for filename in sorted(t.dependencies):
|
||||
require_linenos = {}
|
||||
uses = set()
|
||||
lines = open(filename, 'rU').readlines()
|
||||
for lineno, line in _strip_comments(lines):
|
||||
m = re.match(r'goog.provide\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
all_provides.add(m.group(1))
|
||||
continue
|
||||
m = re.match(r'goog.require\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
require_linenos[m.group(1)] = lineno
|
||||
continue
|
||||
ignore_linenos = require_linenos.values()
|
||||
for lineno, line in enumerate(lines):
|
||||
if lineno in ignore_linenos:
|
||||
continue
|
||||
for require in require_linenos.iterkeys():
|
||||
if require in line:
|
||||
uses.add(require)
|
||||
for require in sorted(set(require_linenos.keys()) - uses):
|
||||
t.info('%s:%d: unused goog.require: %r' % (
|
||||
filename, require_linenos[require], require))
|
||||
unused_count += 1
|
||||
all_provides.discard('ol')
|
||||
all_provides.discard('ol.MapProperty')
|
||||
|
||||
class Node(object):
|
||||
|
||||
def __init__(self):
|
||||
self.present = False
|
||||
self.children = {}
|
||||
|
||||
def _build_re(self, key):
|
||||
if key == '*':
|
||||
assert len(self.children) == 0
|
||||
# We want to match `.doIt` but not `.SomeClass` or `.more.stuff`
|
||||
return '(?=\\.[a-z]\\w*\\b(?!\\.))'
|
||||
elif len(self.children) == 1:
|
||||
child_key, child = next(self.children.iteritems())
|
||||
child_re = child._build_re(child_key)
|
||||
if child_key != '*':
|
||||
child_re = '\\.' + child_re
|
||||
if self.present:
|
||||
return key + '(' + child_re + ')?'
|
||||
else:
|
||||
return key + child_re
|
||||
elif self.children:
|
||||
children_re = '(?:' + '|'.join(
|
||||
('\\.' if k != '*' else '') + self.children[k]._build_re(k)
|
||||
for k in sorted(self.children.keys())) + ')'
|
||||
if self.present:
|
||||
return key + children_re + '?'
|
||||
else:
|
||||
return key + children_re
|
||||
else:
|
||||
assert self.present
|
||||
return key
|
||||
|
||||
def build_re(self, key):
|
||||
return re.compile('\\b' + self._build_re(key) + '\\b')
|
||||
root = Node()
|
||||
for provide in all_provides:
|
||||
node = root
|
||||
for component in provide.split('.'):
|
||||
if component not in node.children:
|
||||
node.children[component] = Node()
|
||||
node = node.children[component]
|
||||
if component[0].islower():
|
||||
# We've arrived at a namespace provide like `ol.foo`.
|
||||
# In this case, we want to match uses like `ol.foo.doIt()` but
|
||||
# not match things like `new ol.foo.SomeClass()`.
|
||||
# For this purpose, we use the special wildcard key for the child.
|
||||
node.children['*'] = Node()
|
||||
else:
|
||||
node.present = True
|
||||
provide_res = [child.build_re(key)
|
||||
for key, child in root.children.iteritems()]
|
||||
missing_count = 0
|
||||
for filename in sorted(t.dependencies):
|
||||
provides = set()
|
||||
requires = set()
|
||||
uses = set()
|
||||
uses_linenos = {}
|
||||
for lineno, line in _strip_comments(open(filename, 'rU')):
|
||||
m = re.match(r'goog.provide\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
provides.add(m.group(1))
|
||||
continue
|
||||
m = re.match(r'goog.require\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
requires.add(m.group(1))
|
||||
continue
|
||||
while True:
|
||||
for provide_re in provide_res:
|
||||
m = provide_re.search(line)
|
||||
if m:
|
||||
uses.add(m.group())
|
||||
uses_linenos[m.group()] = lineno
|
||||
line = line[:m.start()] + line[m.end():]
|
||||
break
|
||||
else:
|
||||
break
|
||||
if filename == 'src/ol/renderer/layerrenderer.js':
|
||||
uses.discard('ol.renderer.Map')
|
||||
m = re.match(
|
||||
r'src/ol/renderer/(\w+)/\1(\w*)layerrenderer\.js\Z', filename)
|
||||
if m:
|
||||
uses.discard('ol.renderer.Map')
|
||||
uses.discard('ol.renderer.%s.Map' % (m.group(1),))
|
||||
missing_requires = uses - requires - provides
|
||||
if missing_requires:
|
||||
for missing_require in sorted(missing_requires):
|
||||
t.info("%s:%d missing goog.require('%s')" %
|
||||
(filename, uses_linenos[missing_require], missing_require))
|
||||
missing_count += 1
|
||||
if unused_count or missing_count:
|
||||
t.error('%d unused goog.requires, %d missing goog.requires' %
|
||||
(unused_count, missing_count))
|
||||
t.touch()
|
||||
|
||||
|
||||
@target('build/check-whitespace-timestamp', SRC, EXAMPLES_SRC_JS,
|
||||
SPEC, SPEC_RENDERING, JSDOC_SRC, precious=True)
|
||||
def build_check_whitespace_timestamp(t):
|
||||
CR_RE = re.compile(r'\r')
|
||||
LEADING_WHITESPACE_RE = re.compile(r'\s+')
|
||||
TRAILING_WHITESPACE_RE = re.compile(r'\s+\n\Z')
|
||||
NO_NEWLINE_RE = re.compile(r'[^\n]\Z')
|
||||
ALL_WHITESPACE_RE = re.compile(r'\s+\Z')
|
||||
errors = 0
|
||||
for filename in sorted(t.newer(t.dependencies)):
|
||||
whitespace = False
|
||||
for lineno, line in enumerate(open(filename, 'rU')):
|
||||
if lineno == 0 and LEADING_WHITESPACE_RE.match(line):
|
||||
t.info('%s:%d: leading whitespace', filename, lineno + 1)
|
||||
errors += 1
|
||||
if CR_RE.search(line):
|
||||
t.info('%s:%d: carriage return character in line', filename, lineno + 1)
|
||||
errors += 1
|
||||
if TRAILING_WHITESPACE_RE.search(line):
|
||||
t.info('%s:%d: trailing whitespace', filename, lineno + 1)
|
||||
errors += 1
|
||||
if NO_NEWLINE_RE.search(line):
|
||||
t.info('%s:%d: no newline at end of file', filename, lineno + 1)
|
||||
errors += 1
|
||||
whitespace = ALL_WHITESPACE_RE.match(line)
|
||||
if whitespace:
|
||||
t.info('%s: trailing whitespace at end of file', filename)
|
||||
errors += 1
|
||||
if errors:
|
||||
t.error('%d whitespace errors' % (errors,))
|
||||
t.touch()
|
||||
|
||||
|
||||
virtual('apidoc', 'build/jsdoc-%(BRANCH)s-timestamp' % vars(variables))
|
||||
|
||||
|
||||
@target('build/jsdoc-%(BRANCH)s-timestamp' % vars(variables),
|
||||
SRC, SHADER_SRC, ifind('config/jsdoc/api/template'),
|
||||
NPM_INSTALL)
|
||||
def jsdoc_BRANCH_timestamp(t):
|
||||
t.run('%(JSDOC)s', 'config/jsdoc/api/index.md',
|
||||
'-c', 'config/jsdoc/api/conf.json',
|
||||
'-d', 'build/hosted/%(BRANCH)s/apidoc')
|
||||
t.touch()
|
||||
|
||||
|
||||
def split_example_file(example, dst_dir):
|
||||
lines = open(example, 'rU').readlines()
|
||||
|
||||
target_lines = []
|
||||
target_require_lines = []
|
||||
|
||||
found_requires = False
|
||||
found_code = False
|
||||
for line in lines:
|
||||
m = re.match(r'goog.require\(\'(.*)\'\);', line)
|
||||
if m:
|
||||
found_requires = True
|
||||
target_require_lines.append(line)
|
||||
elif found_requires:
|
||||
if found_code or line not in ('\n', '\r\n'):
|
||||
found_code = True
|
||||
target_lines.append(line)
|
||||
|
||||
target = open(
|
||||
os.path.join(dst_dir, os.path.basename(example)), 'wb')
|
||||
target_require = open(
|
||||
os.path.join(dst_dir, os.path.basename(example)
|
||||
.replace('.js', '-require.js')),
|
||||
'wb')
|
||||
|
||||
target.writelines(target_lines)
|
||||
target.close()
|
||||
|
||||
target_require.writelines(target_require_lines)
|
||||
target_require.close()
|
||||
|
||||
|
||||
@target('host-examples', 'build', 'examples', phony=True)
|
||||
def host_examples(t):
|
||||
examples_dir = 'build/hosted/%(BRANCH)s/examples'
|
||||
build_dir = 'build/hosted/%(BRANCH)s/build'
|
||||
css_dir = 'build/hosted/%(BRANCH)s/css'
|
||||
closure_lib_path = output('node', '-e',
|
||||
'process.stdout.write(require("closure-util").getLibraryPath())')
|
||||
t.rm_rf(examples_dir)
|
||||
t.cp_r('build/examples', examples_dir)
|
||||
for example in EXAMPLES_SRC_JS:
|
||||
split_example_file(example, examples_dir % vars(variables))
|
||||
t.cp('bin/loader_hosted_examples.js', examples_dir + '/loader.js')
|
||||
t.rm_rf(build_dir)
|
||||
t.makedirs(build_dir)
|
||||
t.rm_rf(css_dir)
|
||||
t.makedirs(css_dir)
|
||||
t.cp('build/ol.js', 'build/ol-debug.js', build_dir)
|
||||
t.cp('build/ol.css', css_dir)
|
||||
t.rm_rf('build/hosted/%(BRANCH)s/closure-library')
|
||||
t.cp_r(closure_lib_path, 'build/hosted/%(BRANCH)s/closure-library')
|
||||
t.rm_rf('build/hosted/%(BRANCH)s/ol')
|
||||
t.makedirs('build/hosted/%(BRANCH)s/ol')
|
||||
t.cp_r('src/ol', 'build/hosted/%(BRANCH)s/ol/ol')
|
||||
t.rm_rf('build/hosted/%(BRANCH)s/ol.ext')
|
||||
t.cp_r('build/ol.ext', 'build/hosted/%(BRANCH)s/ol.ext')
|
||||
t.run('%(PYTHON)s', closure_lib_path + '/closure/bin/build/depswriter.py',
|
||||
'--root_with_prefix', 'src ../../../ol',
|
||||
'--root_with_prefix', 'build/ol.ext ../../../ol.ext',
|
||||
'--root', 'build/hosted/%(BRANCH)s/closure-library/closure/goog',
|
||||
'--root_with_prefix', 'build/hosted/%(BRANCH)s/closure-library/'
|
||||
'third_party ../../third_party',
|
||||
'--output_file', 'build/hosted/%(BRANCH)s/build/ol-deps.js')


@target('check-examples', 'host-examples', phony=True)
def check_examples(t):
    examples = ['build/hosted/%(BRANCH)s/' + e
                for e in EXAMPLES_SRC_HTML
                if not open(e.replace('.html', '.js'), 'rU').readline().startswith('// NOCOMPILE')]
    all_examples = [e + '?mode=advanced' for e in examples]
    # Run the examples checks in a pool of threads
    pool = ThreadPool()
    for example in all_examples:
        pool.add_task(t.run, '%(PHANTOMJS)s', '--ssl-protocol=any',
                      '--ignore-ssl-errors=true', 'bin/check-example.js', example)
    errors = pool.wait_completion()
    if errors:
        sys.exit(1)


@target('test', NPM_INSTALL, 'build/test_requires.js', phony=True)
def test(t):
    t.run('node', 'tasks/test.js')


@target('test-coverage', NPM_INSTALL, phony=True)
def test_coverage(t):
    t.run('node', 'tasks/test-coverage.js')


@target('test-rendering', 'build/test_rendering_requires.js',
        NPM_INSTALL, phony=True)
def test_rendering(t):
    # create a temp. profile to run the tests with WebGL
    tmp_profile_dir = 'build/slimerjs-profile'
    t.rm_rf(tmp_profile_dir)
    t.cp_r('test_rendering/slimerjs-profile', tmp_profile_dir)
    t.run('node', 'tasks/test-rendering.js')


@target('fixme', phony=True)
def find_fixme(t):
    regex = re.compile('FIXME|TODO')
    matches = dict()
    totalcount = 0
    for filename in SRC:
        f = open(filename, 'r')
        for lineno, line in enumerate(f):
            if regex.search(line):
                if (filename not in matches):
                    matches[filename] = list()
                matches[filename].append('#%-10d %s' % (
                    lineno + 1, line.strip()))
                totalcount += 1
        f.close()

    for filename in matches:
        num_matches = len(matches[filename])
        noun = 'matches' if num_matches > 1 else 'match'
        print ' %s has %d %s:' % (filename, num_matches, noun)
        for match in matches[filename]:
            print ' %s' % (match,)
        print
    print 'A total of %d TODO/FIXME(s) were found' % (totalcount,)


@target('reallyclean')
def reallyclean(t):
    """Removes untracked files and folders from previous builds."""
    # -X => only clean up files that are usually ignored e.g.
    #       through .gitignore
    # -d => also consider directories for deletion
    # -f => if git configuration variable clean.requireForce != false,
    #       git clean will refuse to run unless given -f or -n.
    t.run('%(GIT)s', 'clean', '-X', '-d', '-f', '.')


@target('checkdeps')
def check_dependencies(t):
    for exe in EXECUTABLES:
        status = 'present' if which(exe) else 'MISSING'
        print 'Program "%s" seems to be %s.' % (exe, status)
    print 'For certain targets all above programs need to be present.'


@target('help')
def display_help(t):
    print '''
build.py - The OpenLayers 3 build script.

Usage:
  ./build.py [options] [target]                        (on Unix-based machines)
  <python-executable.exe> build.py [options] [target]  (on Windows machines)

There is one option:
  -c              - Cleans up the repository from previous builds.

The most common targets are:
  serve           - Serves files, on port 3000.
  lint            - Runs gjslint on all sourcefiles to enforce specific syntax.
  build           - Builds singlefile versions of OpenLayers JavaScript and
                    CSS. This is also the default build target which runs when
                    no target is specified.
  test            - Runs the testsuite and displays the results.
  test-rendering  - Runs the rendering testsuite and displays the results.
  check           - Runs the lint-target, builds some OpenLayers files, and
                    then runs test. Many developers call this target often
                    while working on the code.
  help            - Shows this help.

Other less frequently used targets are:
  apidoc          - Builds the API-Documentation using JSDoc3.
  ci              - Builds all examples in various modes and usually takes a
                    long time to finish. This target calls the following
                    targets: 'lint', 'build', 'test', 'test-rendering',
                    'build/compiled-examples/all.combined.js', 'check-examples',
                    and 'apidoc'. This is the target run on Travis CI.
  test-coverage   - Generates a test coverage report in the coverage folder.
  reallyclean     - Remove untracked files from the repository.
  checkdeps       - Checks whether all required development software is
                    installed on your machine.
  fixme           - Will print a list of parts of the code that are marked
                    with either TODO or FIXME.
  todo            - This is an alias for the fixme-target (see above).

If no target is given, the build-target will be executed.

The above list is not complete, please see the source code for not-mentioned
and only seldom called targets.
'''


if __name__ == '__main__':
    signal.signal(signal.SIGINT, sigint_handler)
    main()

config/example.json (Normal file, 79 lines)
@@ -0,0 +1,79 @@
{
  "exports": [],
  "src": [
    "src/**/*.js",
    "build/ol.ext/*.js",
    "build/examples/{{id}}.js"
  ],
  "compile": {
    "js": [
      "externs/olx.js",
      "externs/oli.js"
    ],
    "externs": [
      "externs/bingmaps.js",
      "externs/bootstrap.js",
      "externs/closure-compiler.js",
      "externs/example.js",
      "externs/fastclick.js",
      "externs/geojson.js",
      "externs/jquery-1.9.js",
      "externs/proj4js.js",
      "externs/tilejson.js",
      "externs/topojson.js",
      "externs/vbarray.js"
    ],
    "define": [
      "goog.array.ASSUME_NATIVE_FUNCTIONS=true",
      "goog.dom.ASSUME_STANDARDS_MODE=true",
      "goog.json.USE_NATIVE_JSON=true",
      "goog.DEBUG=false"
    ],
    "jscomp_error": [
      "accessControls",
      "ambiguousFunctionDecl",
      "checkEventfulObjectDisposal",
      "checkRegExp",
      "checkStructDictInheritance",
      "checkTypes",
      "checkVars",
      "const",
      "constantProperty",
      "deprecated",
      "duplicateMessage",
      "es3",
      "es5Strict",
      "externsValidation",
      "fileoverviewTags",
      "globalThis",
      "internetExplorerChecks",
      "invalidCasts",
      "misplacedTypeAnnotation",
      "missingGetCssName",
      "missingProperties",
      "missingProvide",
      "missingRequire",
      "missingReturn",
      "newCheckTypes",
      "nonStandardJsDocs",
      "suspiciousCode",
      "strictModuleDepCheck",
      "typeInvalidation",
      "undefinedNames",
      "undefinedVars",
      "uselessCode",
      "visibility"
    ],
    "jscomp_off": [
      "unknownDefines"
    ],
    "extra_annotation_name": [
      "api", "observable"
    ],
    "compilation_level": "ADVANCED",
    "warning_level": "VERBOSE",
    "output_wrapper": "(function(){%output%})();",
    "use_types_for_optimization": true,
    "manage_closure_dependencies": true
  }
}
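The `{{id}}` placeholder in the `src` list suggests this file is a per-example compile configuration template: the example name is substituted for `{{id}}` before the configuration is handed to the Closure Compiler tooling. A minimal sketch of that substitution, for illustration only (the helper function below is hypothetical and not part of this commit):

    # Hypothetical helper, not part of this commit: expand the per-example
    # config template by replacing the '{{id}}' placeholder.
    import json

    def example_config(example_id, path='config/example.json'):
        with open(path) as f:
            raw = f.read()
        return json.loads(raw.replace('{{id}}', example_id))

    # e.g. example_config('some-example')['src'][-1]
    # -> 'build/examples/some-example.js'   ('some-example' is a placeholder id)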
@@ -12,8 +12,7 @@
    "install": "node tasks/install.js",
    "postinstall": "closure-util update",
    "start": "node tasks/serve.js",
    "test": "node tasks/test.js",
    "test-coverage": "node tasks/test-coverage.js"
    "test": "node tasks/test.js"
  },
  "main": "dist/ol.js",
  "repository": {

pake.py (509 lines)
@@ -1,509 +0,0 @@
#!/usr/bin/env python

import collections
import contextlib
import hashlib
import logging
import optparse
import os
import re
import shutil
import subprocess
import tempfile
import sys
import time
import urllib2


logger = logging.getLogger(__name__)


if hasattr(subprocess, 'check_output'):
    check_output = subprocess.check_output
else:
    # Copied with minor modifications from the Python source code
    # http://hg.python.org/cpython/file/9cb1366b251b/Lib/subprocess.py#l549
    def check_output(*popenargs, **kwargs):
        if 'stdout' in kwargs:
            raise ValueError(
                'stdout argument not allowed, it will be overridden.')
        process = subprocess.Popen(stdout=subprocess.PIPE,
                                   *popenargs, **kwargs)
        output, unused_err = process.communicate()
        retcode = process.poll()
        if retcode:
            cmd = kwargs.get("args")
            if cmd is None:
                cmd = popenargs[0]
            raise subprocess.CalledProcessError(retcode, cmd, output=output)
        return output


class PakeError(RuntimeError):
    pass


class AmbiguousRuleError(PakeError):

    def __init__(self, name):
        self.name = name

    def __str__(self):
        return '%r matches multiple rules' % (self.name,)


class BuildError(PakeError):

    def __init__(self, target, message):
        self.target = target
        self.message = message

    def __str__(self):
        return '%s: %s' % (self.target.name, self.message)


class DuplicateTargetError(PakeError):

    def __init__(self, target):
        self.target = target

    def __str__(self):
        return 'duplicate target %r' % (self.target.name,)


class UnknownTargetError(PakeError):

    def __init__(self, name):
        self.name = name

    def __str__(self):
        return 'unknown target %r' % (self.name,)


class Target(object):
    """Target is the core object of pake. It includes all of the target's name
    (which may or may not correspond to a real file in the filesystem, see the
    comments in virtual and TargetCollection below), the action to be performed
    when this target is to be rebuilt, its dependencies, and various other
    metadata."""

    def __init__(self, name, action=None, clean=True, dependencies=(),
                 help=None, help_group=None, makedirs=True, phony=False,
                 precious=False):
        self.name = name
        self.action = action
        self._clean = clean
        self.dependencies = list(flatten(dependencies))
        self.help = help
        self.help_group = help_group
        self._makedirs = makedirs
        self.phony = phony
        self.precious = precious
        self.logger = logging.getLogger(self.name)
        self.timestamp = None

    def build(self, dry_run=False):
        timestamp = 0
        for dependency in self.dependencies:
            target = targets.get(dependency)
            timestamp = max(timestamp, target.build(dry_run=dry_run))
        self.debug('build')
        if self.timestamp is None:
            if not self.phony and os.path.exists(self.name):
                self.timestamp = os.stat(self.name).st_mtime
            else:
                self.timestamp = -1
        if self.timestamp < timestamp:
            self.debug('action')
            if self._makedirs and not dry_run:
                self.makedirs(os.path.dirname(self.name))
            if self.action:
                if self.action.__doc__:
                    self.info(self.action.__doc__)
                if not dry_run:
                    self.action(self)
            self.timestamp = timestamp or time.time()
        return self.timestamp

    @contextlib.contextmanager
    def chdir(self, dir):
        cwd = os.getcwd()
        dir = dir % vars(variables)
        self.info('cd %s', dir)
        os.chdir(dir)
        try:
            yield dir
        finally:
            self.info('cd %s', cwd)
            os.chdir(cwd)

    def cp(self, *args):
        args = flatten_expand_list(args)
        dest = args.pop()
        for arg in args:
            self.info('cp %s %s', arg, dest)
            shutil.copy(arg, dest)

    def cp_r(self, *args):
        args = flatten_expand_list(args)
        dest = args.pop()
        for arg in args:
            self.info('cp -r %s %s', arg, dest)
            shutil.copytree(arg, dest)

    def clean(self, really=False, recurse=True):
        if (self._clean or really) and not self.precious:
            self.info('clean')
            try:
                os.remove(self.name)
            except OSError:
                pass
        if recurse:
            for dependency in self.dependencies:
                targets.get(dependency).clean(really=really, recurse=recurse)

    def debug(self, *args, **kwargs):
        self.logger.debug(*args, **kwargs)

    def download(self, url, md5=None, sha1=None):
        content = urllib2.urlopen(url).read()
        if md5 and hashlib.md5(content).hexdigest() != md5:
            raise BuildError(self, 'corrupt download')
        if sha1 and hashlib.sha1(content).hexdigest() != sha1:
            raise BuildError(self, 'corrupt download')
        with open(self.name, 'wb') as f:
            f.write(content)

    def error(self, message):
        raise BuildError(self, message)

    def graph(self, f, visited):
        if self in visited:
            return
        visited.add(self)
        for dependency in self.dependencies:
            target = targets.get(dependency)
            f.write('\t"%s" -> "%s";\n' % (self.name, target.name))
            target.graph(f, visited)

    def info(self, *args, **kwargs):
        self.logger.info(*args, **kwargs)

    def makedirs(self, path):
        path = path % vars(variables)
        if path and not os.path.exists(path):
            self.info('mkdir -p %s', path)
            os.makedirs(path)

    def newer(self, *args):
        args = flatten_expand_list(args)
        return [arg for arg in args
                if targets.get(arg).timestamp > self.timestamp]

    def output(self, *args, **kwargs):
        """output runs the command passed to it, saving the output of the
        command to the contents of the target. For example:
            @target('ofile')
            def ofile(t):
                t.output('echo', '123')
        After this target's action is executed, ofile will contain the string
        "123"."""
        args = flatten_expand_list(args)
        self.info(' '.join(args))
        try:
            output = check_output(args, **kwargs)
            with open(self.name, 'wb') as f:
                f.write(output)
        except subprocess.CalledProcessError as e:
            self.clean(recurse=False)
            self.error(e)

    def rm_rf(self, *args):
        """rm_rf recursively deletes the files and/or directories passed to
        it."""
        args = flatten_expand_list(args)
        for arg in args:
            self.info('rm -rf %s', arg)
            shutil.rmtree(arg, ignore_errors=True)

    def run(self, *args, **kwargs):
        args = flatten_expand_list(args)
        self.info(' '.join(args))
        try:
            subprocess.check_call(args, **kwargs)
        except subprocess.CalledProcessError as e:
            self.clean(recurse=False)
            self.error(e)

    @contextlib.contextmanager
    def tempdir(self):
        """tempdir creates a temporary directory, changes to it, and runs the
        nested block of code. However the nested block of code exits, tempdir
        will delete the temporary directory permanently, before pake exits. For
        example:
            with t.tempdir():
                # copy various files to $PWD (the temporary directory)
                # zip up the contents of $PWD, or copy them somewhere else
        However the above code exits (e.g. copy error or zip error), the
        temporary directory will be cleaned up."""
        tempdir = tempfile.mkdtemp()
        self.info('mkdir -p %s', tempdir)
        try:
            yield tempdir
        finally:
            self.info('rm -rf %s', tempdir)
            shutil.rmtree(tempdir, ignore_errors=True)

    def touch(self):
        """touch updates the timestamp of the target. If the target already
        exists as a file in the filesystem its timestamp is updated, otherwise
        a new file is created with the current timestamp."""
        if os.path.exists(self.name):
            os.utime(self.name, None)
        else:
            with open(self.name, 'wb'):
                pass


class TargetCollection(object):
    """TargetCollection implements a namespace for looking up build targets.
    TargetCollection will first look for rules that match exactly, and then
    - if no match is found - search through a list of regular expression-based
    rules. As soon as a regular expression match is found, that rule is added
    to the list of rules that match exactly. Typically, an invocation of pake
    will only create a single TargetCollection."""

    def __init__(self):
        self.default = None
        self.targets = {}

    def add(self, target):
        """add adds a concrete target to self, raising an error if the target
        already exists. If target is the first target to be added, it becomes
        the default for this TargetCollection."""
        if target.name in self.targets:
            raise DuplicateTargetError(target)
        self.targets[target.name] = target
        if self.default is None:
            self.default = target

    def get(self, name):
        """get searches for a target. If it already exists, it is returned.
        Otherwise, get searches through the defined rules, trying to find a
        rule that matches. If it finds a matching rule, a concrete target is
        instantiated, cached, and returned. If no match is found, a virtual
        precious target is instantiated and returned."""
        if name in self.targets:
            return self.targets[name]
        target = None
        for regexp, f in rules.iteritems():
            match = regexp.search(name)
            if not match:
                continue
            if target is not None:
                raise AmbiguousRuleError(name)
            target = f(name, match)
        if target is None:
            if os.path.exists(name):
                target = Target(name, precious=True)
            else:
                raise UnknownTargetError(name)
        self.targets[name] = target
        return target

    def format_epilog(self, formatter):
        helps_by_help_group = collections.defaultdict(dict)
        max_name_len = 0
        for name in sorted(self.targets):
            target = self.targets[name]
            if target.help is not None:
                helps_by_help_group[target.help_group][name] = target.help
                max_name_len = max(max_name_len, len(name))
        lines = []
        lines.append('Targets:\n')
        format = ' %%-%ds %%s\n' % (max_name_len,)
        for help_group in sorted(helps_by_help_group.keys()):
            helps = helps_by_help_group[help_group]
            if help_group is not None:
                lines.append('%s targets:\n' % (help_group,))
            for name in sorted(helps.keys()):
                lines.append(format % (name, helps[name]))
        return ''.join(lines)


class VariableCollection(object):
    """VariableCollection implements an object with properties where the first
    set of a property wins, and all further sets are ignored. For example:
        vc = VariableCollection()
        vc.FOO = 1  # First set of the property FOO
        vc.FOO = 2  # Further sets of the property FOO are ignored, and do
                    # not raise an error. After this statement, vc.FOO is
                    # still 1.
        print vc.FOO  # Prints "1" """

    def __init__(self, **kwargs):
        for key, value in kwargs.iteritems():
            setattr(self, key, value)

    def __setattr__(self, key, value):
        """Only set an attribute if it has not already been set. First to set
        the value is the winner."""
        if not hasattr(self, key):
            object.__setattr__(self, key, value)


# targets is the single TargetCollection instance created for this invokation
# of pake
targets = TargetCollection()
# rules is a dict of regular expressions to @rules where dynamically created
# rules are registered.
rules = {}
# variables is the global set of substitution variables, where the first setter
# takes priority. The priority order is:
# 1. Environment variables
# 2. Command line arguments
# 3. Internal Python settings in build.py
variables = VariableCollection(**os.environ)


def flatten(*args):
    """flatten takes a variable number of arguments, each of which may or may
    be not be a collection.Iterable, and yields the elements of each in
    depth-first order. In short, it flattens nested iterables into a single
    collection. For example, flatten(1, [2, (3, 4), 5], 6) yields 1, 2, 3, 4,
    5, 6."""
    for arg in args:
        if (isinstance(arg, collections.Iterable) and
                not isinstance(arg, basestring)):
            for element in flatten(*arg):
                yield element
        else:
            yield arg


def flatten_expand_list(*args):
    """flatten_expand_list applies flatten, treats each element as a string,
    and formats each string according to the global value of variables."""
    return list(arg % vars(variables) for arg in flatten(args))


def ifind(*paths):
    """ifind is an iterative version of os.walk, yielding all walked paths and
    normalizing paths to use forward slashes."""
    for path in paths:
        for dirpath, dirnames, names in os.walk(path):
            for name in names:
                if os.sep == '/':
                    yield os.path.join(dirpath, name)
                else:
                    yield '/'.join(dirpath.split(os.sep) + [name])


def main(argv=sys.argv):
    option_parser = optparse.OptionParser()
    option_parser.add_option('-c', '--clean',
                             action='store_true')
    option_parser.add_option('-g', '--graph',
                             action='store_true')
    option_parser.add_option('-n', '--dry-run', '--just-print', '--recon',
                             action='store_true')
    option_parser.add_option('-r', '--really',
                             action='store_true')
    option_parser.add_option('-v', '--verbose',
                             action='count', dest='logging_level')
    option_parser.set_defaults(logging_level=0)
    option_parser.format_epilog = targets.format_epilog
    options, args = option_parser.parse_args(argv[1:])
    logging.basicConfig(format='%(asctime)s %(name)s: %(message)s',
                        level=logging.INFO - 10 * options.logging_level)
    targets_ = []
    for arg in args:
        match = re.match(r'(?P<key>\w+)=(?P<value>.*)\Z', arg)
        if match:
            key, value = match.group('key', 'value')
            if not hasattr(variables, key):
                logger.error('%s is not a variable', key)
            logger.debug('%s=%r', key, value)
            object.__setattr__(variables, key, value)
            continue
        targets_.append(arg)
    if not targets_:
        targets_ = (targets.default.name,)
    try:
        for target in targets_:
            target = targets.get(target)
            if options.clean:
                target.clean(really=options.really, recurse=True)
            elif options.graph:
                sys.stdout.write('digraph "%s" {\n' % (target.name,))
                target.graph(sys.stdout, set())
                sys.stdout.write('}\n')
            else:
                target.build(dry_run=options.dry_run)
    except BuildError as e:
        logger.error(e)
        sys.exit(1)


def output(*args):
    """output captures the output of a single command. It is typically used to
    set variables that only need to be set once. For example:
        UNAME_A = output('uname', '-a')
    If you need to capture the output of a command in a target, you should use
    t.output."""
    args = flatten_expand_list(args)
    logger.debug(' '.join(args))
    return check_output(args)


def rule(pattern):
    def f(targetmaker):
        rules[re.compile(pattern)] = targetmaker
    return f
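No `@rule` appears in the portion of build.py shown above, but a rule would be registered roughly as follows. This is a hypothetical sketch: the pattern, the `minify` command and the target maker are illustrative only, not taken from this repository.

    @rule(r'\.min\.js\Z')
    def minified_js(name, match):
        # 'minify' is a placeholder command, not a tool used by this project.
        def action(t):
            t.run('minify', '-o', t.name, t.dependencies)
        return Target(name, action=action,
                      dependencies=[name[:-len('.min.js')] + '.js'])

TargetCollection.get() above calls the registered function with the requested name and the regular expression match, then caches the Target it returns.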
def target(name, *dependencies, **kwargs):
    """The @target decorator describes the action needed to build a single
    target file when its dependencies are out of date. For example:
        @target('hello', 'hello.c')
        def hello(t):
            t.run('gcc', '-o', t.name, t.dependencies)
            # the above line will run gcc -o hello hello.c
    See the documentation for Target to see the properties provide by the
    target t."""
    def f(action):
        target = Target(name, action=action, dependencies=dependencies,
                        **kwargs)
        targets.add(target)
    return f


def virtual(name, *dependencies, **kwargs):
    """virtual targets are metatargets. They do not correspond to any real
    file in the filesystem, even if a file with the same name already exists.
    Virtual targets can be thought of as only existing for the duration of the
    build. Their up-to-dateness or otherwise is independent of any existence
    or up-to-dateness of any actual file in the filesystem. Typically they are
    used to group actions such as "all", "build", or "test"."""
    target = Target(name, dependencies=dependencies, clean=False, phony=True,
                    **kwargs)
    targets.add(target)


def which(program):
    """Returns the full path of a given argument or `None`.
    See:
    http://stackoverflow.com/questions/377017/test-if-executable-exists-in-python"""
    def is_exe(fpath):
        return os.path.isfile(fpath) and os.access(fpath, os.X_OK)
    fpath, fname = os.path.split(program)
    if fpath:
        if is_exe(program):
            return program
    else:
        for path in os.environ["PATH"].split(os.pathsep):
            path = path.strip('"')
            exe_file = os.path.join(path, program)
            if is_exe(exe_file):
                return exe_file
    return None
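Taken together, a build file drives this module roughly as follows. This is a hypothetical, minimal sketch for illustration only; build.py (removed above) was the real consumer, and `gcc`/`hello.c` are placeholders.

    # minimal pake build file (hypothetical, for illustration only)
    from pake import main, target, variables, virtual

    variables.GCC = 'gcc'    # first set wins, so a GCC environment variable
                             # already present at import time takes precedence

    virtual('all', 'hello')  # metatarget grouping concrete targets

    @target('hello', 'hello.c', help='compile the hello program')
    def hello(t):
        t.run('%(GCC)s', '-o', t.name, t.dependencies)

    if __name__ == '__main__':
        main()               # e.g. ./thisfile.py hello, or -c to clean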
tasks/generate-requires.js (Normal file, 40 lines)
@@ -0,0 +1,40 @@
var fs = require('fs');

// The number of files that we need to generate goog.require's for.
var numFiles = process.argv.length - 1;

/**
 * Object used a set of found goog.provide's.
 * @type {Object.<string, boolean>}
 */
var requires = {};

process.argv.forEach(function(val, index, array) {

  if (index === 0) {
    return;
  }

  fs.readFile(val, function(err, data) {
    if (err) {
      return;
    }

    var re = new RegExp('goog\\.provide\\(\'(.*)\'\\);');

    data.toString().split('\n').forEach(function(line) {
      var match = line.match(re);
      if (match) {
        requires[match[1]] = true;
      }
    });

    if (--numFiles === 0) {
      Object.keys(requires).sort().forEach(function(key) {
        process.stdout.write('goog.require(\'' + key + '\');\n');
      });
    }

  });

});
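The script scans the files named on its command line for `goog.provide()` calls and prints a sorted list of matching `goog.require()` statements on stdout. It is presumably what now produces `build/test_requires.js`, the file the removed `test` target above depended on; in the style of the removed build.py the wiring would have looked roughly like the hypothetical sketch below (the actual wiring lives in the new Makefile and Node tasks, which are not shown in this hunk).

    @target('build/test_requires.js', SRC)
    def test_requires(t):
        # t.output() writes the command's stdout to the target file
        t.output('node', 'tasks/generate-requires.js', SRC)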
@@ -16,13 +16,13 @@ Install the test dependencies (from the root of the repository):

Run the tests once with PhantomJS:

    ./build.py test
    make test

(Note that for `npm` users, this can also be run as `npm test`.)

Run the tests in a browser:

    ./build.py serve
    make serve

(Again for `npm` users, this is `npm start`.)

@@ -31,4 +31,4 @@ any time one of the source or spec files changes.

Tip for TDD'ers: to make PhantomJS run the test suite continuously each time
a spec file is changed you can use nosier (http://pypi.python.org/pypi/nosier)
and do `nosier -p test -p src "./build.py test"`.
and do `nosier -p test -p src "make test"`.

@@ -22,7 +22,7 @@
  });
</script>

<!-- This script is provided by the debug server (start with `build.py serve`) -->
<!-- This script is provided by the debug server (started with `make serve`) -->
<script type="text/javascript" src="loader.js"></script>

<script type="text/javascript">

@@ -7,10 +7,10 @@ Similar to the unit tests, there are two ways to run the tests: directly in the
browser or using [SlimerJS](http://slimerjs.org/) from the command-line.

To run the tests in the browser, make sure the development server is running
(`./build.py serve`) and open the URL
(`make serve`) and open the URL
[http://localhost:3000/test_rendering/index.html](http://localhost:3000/test_rendering/index.html).

From the command-line the tests can be run with the build target `./build.py test-rendering`.
From the command-line the tests can be run with the build target `make test-rendering`.

## Adding new tests

When creating a new test case, a reference image has to be created. By appending `?generate`

@@ -23,7 +23,7 @@
  });
</script>

<!-- This script is provided by the debug server (start with `build.py serve`) -->
<!-- This script is provided by the debug server (started with `make serve`) -->
<script type="text/javascript" src="loader.js"></script>

<script type="text/javascript">
