263 Commits

Author SHA1 Message Date
hugh-obrien
bfe31098cb send html 2018-07-09 14:37:54 +01:00
hugh-obrien
a1f9305047 hard coded a word doc 2018-07-09 14:02:39 +01:00
hugh-obrien
c66d76af2d make it do pdf 2018-07-09 13:52:18 +01:00
hugh-obrien
b8c82620e5 main file hack 2018-07-09 13:45:57 +01:00
hugh-obrien
3b905353d0 main file fix nonsense 2018-07-09 11:40:40 +01:00
hugh-obrien
af6a402a87 fix array nonsense 2018-07-09 11:37:57 +01:00
hugh-obrien
3c639959f4 hack in the command and image 2018-07-09 11:35:19 +01:00
Alberto Fernández-Capel
039d5e01ec Merge pull request #87 from sharelatex/afc-travis-nvmrc
Make travis read the node version from the .nvmrc file
2018-05-01 09:45:55 +01:00
Alberto Fernández Capel
1d38dd3a92 Make travis read the node version from the .nvmrc file
See https://docs.travis-ci.com/user/languages/javascript-with-nodejs/#Specifying-Node.js-versions-using-.nvmrc
2018-05-01 09:25:37 +01:00
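Per the Travis docs linked above, leaving the version out of `node_js` makes Travis fall back to the version pinned in the repository's `.nvmrc`. A sketch of what the resulting `.travis.yml` would look like (illustrative, not the repository's exact file):

```yaml
# Sketch only: no version listed under node_js, so Travis reads it
# from the checked-in .nvmrc (6.11.2) instead of a hard-coded value.
language: node_js
before_install:
  - npm install -g grunt-cli
```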
James Allen
12c1dc632a Merge pull request #83 from sharelatex/ja-dockerize-dev
Provide hosts as settings and add npm run start script
2018-01-16 17:08:09 +00:00
James Allen
7a6294081d Allow texlive image user to be configured 2018-01-16 10:46:59 +00:00
Brian Gough
7d8a18c46c Merge pull request #82 from sharelatex/bg-log-core-files-as-error
log an error if core file is found in output
2018-01-04 09:22:44 +00:00
Brian Gough
a0d5e6a54b log an error if core file is found in output 2018-01-03 15:41:31 +00:00
James Allen
f58ef67875 Provide hosts and siblings container as environment settings and add npm run start script 2017-12-29 08:08:19 +00:00
Joe Green
6d42e18088 Add a 1 second delay to the smoke tests (#81)
* Add a 1 second delay to the smoke tests

Fixes a race condition where smoke tests exit before container can be attached to.

See here for more info: https://github.com/overleaf/sharelatex/issues/274

* give the smoke tests additional work to do

* escape slashes
2017-12-05 16:51:59 +00:00
Joe Green
ef0db41dae Merge pull request #80 from sharelatex/jg-smoketest-interval
Increase smoke test interval to 30 seconds
2017-11-29 15:34:49 +00:00
Joe Green
3692570df0 Increase smoke test interval to 30 seconds
The smoke tests can sometimes take ~20 seconds to complete, which causes the http POST to time out. This should solve that problem.
2017-11-29 11:01:51 +00:00
Brian Gough
8255997fad Merge pull request #79 from sharelatex/bg-fix-listen-in-acceptance-tests
exit if mock server fails to start
2017-10-25 09:13:33 +01:00
Brian Gough
360e8220ce exit if mock server fails to start 2017-10-20 15:16:35 +01:00
Joe Green
23f4f2175c Update Jenkinsfile 2017-10-16 14:13:51 +01:00
Joe Green
eb35cab72d only alert on master 2017-10-12 16:54:54 +01:00
Brian Gough
48b2548533 Merge pull request #78 from sharelatex/bg-fix-read-logging
fix read logging
2017-10-02 16:12:12 +01:00
Brian Gough
86cc30d8fa fix typo in log message 2017-10-02 15:45:09 +01:00
Brian Gough
60ad425205 move logging from SafeReader into caller
prevent unnecessary logging when looking at headers of files where
hitting the end of the file is expected.
2017-10-02 15:44:00 +01:00
Brian Gough
d63f339fc4 Merge pull request #77 from sharelatex/bg-fix-tikzexternalize-II
fix tikzexternalize ii
2017-10-02 11:19:06 +01:00
Brian Gough
1da918e13c simplify tikzexternalize checks 2017-09-29 17:00:53 +01:00
Brian Gough
d1aa1d84fb keep tikzexternalize files 2017-09-29 16:02:23 +01:00
Joe Green
88eafdf575 Update Jenkinsfile 2017-09-28 13:46:01 +01:00
Brian Gough
d8858cfadd Merge branch 'bg-lock-compiles' 2017-09-28 13:16:29 +01:00
Joe Green
fd0cbb2c52 use npm cache in CI build 2017-09-28 11:51:41 +01:00
Joe Green
bd5a0ef36f Jg jenkinsfile cleanup (#75)
* Update Jenkinsfile

make sure we don't ship unneeded build files

* Update ExampleDocumentTests.coffee

* use node 6.11.2 in jenkins file
2017-09-28 11:50:33 +01:00
Brian Gough
1388093866 Merge pull request #73 from sharelatex/bg-handle-dot-files-in-resource-list
handle dot files in resource list
2017-09-28 09:59:27 +01:00
Joe Green
c3e3e3d8ac Update Jenkinsfile 2017-09-26 11:44:48 +01:00
Brian Gough
23fec68111 use a separate function for hidden file check 2017-09-26 11:03:20 +01:00
Brian Gough
dbeff9a7b8 exclude hidden files from output
express static server doesn't serve them and rejects with 404
2017-09-26 10:42:59 +01:00
Brian Gough
f11468b595 remove stat test for missing files 2017-09-26 09:48:09 +01:00
Brian Gough
0930b1cd8f only exclude clsi-specific files from output list 2017-09-26 09:47:29 +01:00
Brian Gough
a36ec7f54e fix comment 2017-09-25 16:06:45 +01:00
Brian Gough
eaa99c7274 fix unit tests for use of fs-extra 2017-09-25 15:28:31 +01:00
Brian Gough
b0f879d652 lock compile directory 2017-09-22 16:19:33 +01:00
Brian Gough
8305268848 unit tests for ResourceStateManager 2017-09-15 13:42:57 +01:00
Brian Gough
aa5eeb0903 fallback check for missing files
dot files are not examined by OutputFileFinder, so do an extra check to
make sure those exist

also check for any relative paths in the resources
2017-09-15 13:41:56 +01:00
Brian Gough
2af05030f2 Merge pull request #71 from sharelatex/bg-merge-state-and-resource-list-files
merge state and resource list files
2017-09-11 08:54:30 +01:00
Joe Green
d04f93855b Add jenkinsfile (#72)
* create Jenkinsfile

* allow texlive image to be set with env vars

* log error message in test

* use sandboxed compiles variables

* Add SANDBOXED_COMPILES_HOST_DIR var to test config

* add SIBLING_CONTAINER_USER env var
2017-09-08 14:06:04 +01:00
Brian Gough
a2c97e6f9a rename saveProjectStateHash to saveProjectState 2017-09-08 13:56:40 +01:00
Brian Gough
acab9d45a0 log any missing files 2017-09-07 16:54:09 +01:00
Brian Gough
0fac2655f7 fix whitespace 2017-09-07 13:52:34 +01:00
Brian Gough
c1ca32184f log error if state file is truncated 2017-09-07 13:52:34 +01:00
Brian Gough
97d7d76e61 combine the resource state and resource list
to prevent them getting out of sync
2017-09-07 13:52:34 +01:00
Shane Kilkelly
d865fda6a9 Merge pull request #70 from sharelatex/sk-node-6
Upgrade to node 6.11
2017-08-31 13:35:27 +01:00
Shane Kilkelly
3d053a2e34 Upgrade to node 6.9 2017-08-29 14:30:43 +01:00
Brian Gough
faa2a325cb added logging 2017-08-29 12:09:31 +01:00
James Allen
b42347ea08 Merge pull request #69 from sharelatex/as-update-docker-runner-config
Update docker-runner-sharelatex config
2017-08-24 15:17:16 +02:00
Alasdair Smith
d5b3101637 Update docker-runner-sharelatex config 2017-08-24 13:34:24 +01:00
Brian Gough
c1d1f93453 Merge pull request #66 from sharelatex/bg-compile-from-redis
Write files incrementally
2017-08-23 15:35:56 +01:00
Brian Gough
fc1782e74c read resource files safely
put a limit on the amount of data read
2017-08-18 11:17:01 +01:00
Brian Gough
6921cf25b8 splice state management into ResourceStateManager 2017-08-18 10:22:17 +01:00
Brian Gough
0b9ddb8efe fix whitespace 2017-08-18 09:41:59 +01:00
Brian Gough
e8064f12a1 finish unit test for incremental update 2017-08-18 09:41:43 +01:00
Brian Gough
e4aad90f33 ResourceWriter unit tests (wip) 2017-08-17 16:59:37 +01:00
Brian Gough
a8aaf58e64 test syncType in RequestParser 2017-08-17 15:57:05 +01:00
Brian Gough
5b5f7b0690 avoid adding draft mode more than once 2017-08-17 15:03:37 +01:00
Brian Gough
2b610030d5 store the resource list in a file 2017-08-17 14:53:35 +01:00
Brian Gough
00ddfdf42b fix unit tests 2017-08-09 15:22:44 +01:00
Brian Gough
c25e96bbc3 add comment about syncType/syncState 2017-08-09 15:22:38 +01:00
Henry Oswald
4eb8c107c9 Merge pull request #68 from sharelatex/ho-mkdir-cache-comiles
use grunt to make compiles and cache dirs
2017-08-09 11:07:36 +01:00
Brian Gough
86fa940c97 clean up the state file if no state passed in 2017-08-08 16:29:57 +01:00
Henry Oswald
7cd81ac3df use grunt to make compiles and cache dirs 2017-08-07 16:21:37 +01:00
Henry Oswald
fdc22c9cd2 Merge pull request #67 from sharelatex/revert-65-add-compiles-folder
Revert "Keep compiles and cache directories"
2017-08-07 15:29:30 +01:00
Henry Oswald
c3fe17d0b6 Revert "Keep compiles and cache directories" 2017-08-07 15:29:18 +01:00
Brian Gough
206adc2d04 fix broken unit tests 2017-08-07 15:00:16 +01:00
Brian Gough
6542ce20b6 fix incremental request 2017-08-07 14:32:28 +01:00
Brian Gough
b4be40d061 restrict syncType values to full/incremental 2017-08-07 10:19:56 +01:00
Brian Gough
11898b897e added files out of sync error object 2017-08-03 15:56:59 +01:00
Brian Gough
74c26120b2 use syncType and syncState for clsi state options 2017-08-03 12:00:32 +01:00
Brian Gough
7e1d3d98e7 write files incrementally 2017-08-02 13:46:10 +01:00
Henry Oswald
d5e0ab5a6f Merge pull request #65 from sharelatex/add-compiles-folder
Keep compiles and cache directories
2017-07-28 11:24:36 +01:00
Hayden Faulds
4c105e7826 keep cache directory 2017-07-27 15:54:20 +01:00
Hayden Faulds
cd5adaff51 keep compiles directory 2017-07-27 14:02:24 +01:00
Henry Oswald
e5081df2a9 Revert "change"
This reverts commit 104ce81ebd.
2017-07-23 22:45:04 +01:00
Henry Oswald
104ce81ebd change 2017-07-23 22:42:07 +01:00
Brian Gough
08fd440df5 Merge pull request #63 from sharelatex/bg-fix-tikzmanager-exception
fix tikzmanager exception
2017-07-20 13:22:58 +01:00
Brian Gough
11cd569ed9 stub out unwanted dependency in unit tests 2017-07-18 11:30:22 +01:00
Brian Gough
472531f617 fix exception for empty content in TikzManager 2017-07-18 11:29:59 +01:00
Brian Gough
ea34a1a89d update acceptance test images for texlive 2017 2017-07-13 13:15:51 +01:00
Brian Gough
2e91f07014 update acceptance tests settings to 2017 image 2017-07-12 16:59:33 +01:00
Shane Kilkelly
6f322583f7 Merge branch 'sk-reduce-kill-project-errors' 2017-06-27 10:03:51 +01:00
Shane Kilkelly
a74f4ac1a6 Send a 404 if the project files have gone away when running synctex.
This is semantically nicer than the 500 response which used to be
produced in these circumstances.
2017-06-23 14:46:40 +01:00
Shane Kilkelly
aa1dd2bf05 Killing an already stopped project is not an error
Log a warning instead and continue.
2017-06-20 09:18:15 +01:00
Shane Kilkelly
8e2584bab4 Mock out logger in tests 2017-06-20 08:25:50 +01:00
Brian Gough
f8530da626 Merge pull request #60 from sharelatex/bg-delete-xdv-files
delete intermediate xdv files from xelatex
2017-06-16 09:13:43 +01:00
Brian Gough
2edc015663 delete intermediate xdv files from xelatex 2017-06-15 15:37:45 +01:00
Brian Gough
f94e9989ec Merge pull request #58 from sharelatex/bg-check-dir-before-synctex
check file exists before running synctex
2017-05-31 10:16:06 +01:00
Brian Gough
c62f8b4854 check directory exists and bail out on error 2017-05-31 10:06:27 +01:00
Brian Gough
2d389130cc Merge pull request #59 from sharelatex/bg-reduce-clsi-error-reporting
don't report compile timeouts to sentry
2017-05-30 15:39:04 +01:00
Brian Gough
aafa691119 check file exists before running synctex 2017-05-24 10:09:43 +01:00
Brian Gough
a98b2b8032 don't report compile timeouts to sentry
just log them instead
2017-05-24 09:42:05 +01:00
Brian Gough
398ba5ae34 Merge pull request #56 from sharelatex/bg-disable-qpdf-setting
add setting to avoid optimisations outside docker
2017-04-11 14:16:19 +01:00
Brian Gough
a1613eac5a add setting to avoid optimisations outside docker 2017-04-10 16:12:03 +01:00
Brian Gough
3526fde665 Merge pull request #55 from sharelatex/bg-check-pdf-output-is-optimised
use pdfinfo on output to ensure pdfs are optimised
2017-04-10 15:06:22 +01:00
Brian Gough
e1b44beb3f use pdfinfo on output to ensure pdfs are optimised
needed to check that qpdf runs correctly inside the docker container
2017-04-07 11:11:27 +01:00
Brian Gough
17b16dadcd Merge pull request #54 from sharelatex/bg-avoid-running-qpdf-on-already-optimised-files
check if file is optimised before running qpdf
2017-04-05 13:18:32 +01:00
Brian Gough
eb1364f249 check if file is optimised before running qpdf 2017-04-04 16:50:06 +01:00
Shane Kilkelly
834ad57312 Add a .nvmrc file 2017-03-27 14:47:48 +01:00
Brian Gough
19dfaa7d55 Merge pull request #53 from sharelatex/bg-sanitise-paths
additional check for valid rootResource
2017-03-21 13:39:27 +00:00
Brian Gough
b529b8add3 Merge pull request #52 from sharelatex/bg-tikz-externalize
support for tikz externalize
2017-03-21 13:39:14 +00:00
Brian Gough
7ccc9500ed check for \tikzexternalize directly
instead of \usepackage{tikz} and \usepackage{pgf}
2017-03-21 11:36:08 +00:00
Brian Gough
750576d1b0 fix path match 2017-03-21 11:30:32 +00:00
Brian Gough
021d848819 create separate function for path checking 2017-03-21 11:29:37 +00:00
Brian Gough
8803762081 support for tikz externalize
make copy of main file as output.tex for tikz externalize
2017-03-20 10:55:28 +00:00
Brian Gough
5af137f60b additional check for valid rootResource 2017-03-20 10:03:48 +00:00
Brian Gough
f059948e27 update xelatex acceptance test pdf 2017-03-08 11:49:21 +00:00
Brian Gough
7a7c2ee992 improve debugging of failed acceptance tests
use the example name in the output filename
2017-03-08 11:49:12 +00:00
Brian Gough
efe5e22b4c include otf extension in fontawesome test 2017-03-08 11:25:25 +00:00
Shane Kilkelly
03d1936fde Upgrade logger 2017-03-06 14:56:32 +00:00
Shane Kilkelly
a0969ec839 Don't compile acceptance test files during test run 2017-03-06 14:43:14 +00:00
Brian Gough
fdab7763a2 Merge pull request #51 from sharelatex/bg-fix-latexmk-args
allow latexmk to pass through options
2017-03-03 13:19:23 +00:00
Brian Gough
57a5cfa9cb allow latexmk to pass through options
this avoids problems in the latest version of latexmk where the
$pdflatex variable has been replaced by $xelatex and $lualatex when
running with -xelatex or -lualatex
2017-03-02 16:43:35 +00:00
Joe Green
bfb27e6c25 Merge pull request #50 from sharelatex/ho-remove-tcp
remove tcp code, moved to agent load balancer
2017-02-23 14:42:54 +00:00
Henry Oswald
d4d3048719 remove tcp code, moved to agent load balancer 2017-02-23 11:09:18 +00:00
Brian Gough
29594fd0f7 fix acceptance test config file for latex prefix
latex command prefix was in wrong scope
2017-02-21 09:37:05 +00:00
Brian Gough
a50582fd7c add fontawesome acceptance test for xelatex 2017-02-21 09:37:05 +00:00
Henry Oswald
08f0955817 Merge pull request #49 from sharelatex/ho-one-cpu-size
if host has 1 cpu (staging) then set availableWorkingCpus to 1
2017-02-20 15:20:04 +00:00
Henry Oswald
bc1b8f4b2f Update app.coffee 2017-02-20 15:19:04 +00:00
Henry Oswald
599977c3e0 if host has 1 cpu (staging) then set availableWorkingCpus to 1 2017-02-20 15:16:52 +00:00
Brian Gough
071b2269b3 update acceptance tests for reversion to dvipdf 2017-02-13 13:42:44 +00:00
Brian Gough
fde8149579 fix #! in test script 2017-02-09 15:38:25 +00:00
Brian Gough
6b7e33bbc6 show debug info for acceptance tests 2017-02-09 14:17:38 +00:00
Brian Gough
2898a82de8 update acceptance test output for fontawesome 2017-02-07 11:51:21 +00:00
Brian Gough
5b71b849ca added fontawesome acceptance test 2017-02-07 10:00:41 +00:00
Brian Gough
6cb5926c21 fix lualatex require 2017-02-07 08:59:45 +00:00
Brian Gough
3cffb61c74 add luatex85 package to tikz feynman test 2017-02-07 08:49:19 +00:00
Brian Gough
5705455ce1 added acceptance test for tikz-feynman 2017-02-07 08:12:47 +00:00
Brian Gough
71fb15e0ee update knitr_utf acceptance test output
needs to include table of contents from multiple latexmk runs
2017-02-06 16:27:47 +00:00
Brian Gough
819c642b8d add knitr utf8 acceptance test 2017-02-03 15:38:06 +00:00
Brian Gough
20cb52793d add acceptance test for hebrew 2017-02-03 15:16:47 +00:00
Brian Gough
e507bd6394 update acceptance test image for lualatex
small pixel-level change in output
2017-01-31 16:04:59 +00:00
Brian Gough
444b3586a7 increase debugging in acceptance tests 2017-01-31 10:47:49 +00:00
Brian Gough
e25ebd296e add debugging to acceptance tests 2017-01-31 10:40:05 +00:00
Brian Gough
5090ad5c41 update feymp test image
minor pixel change in position of labels in texlive 2016
2017-01-31 10:21:00 +00:00
Brian Gough
bc73f719b2 update asymptote pdf to a4 size for texlive 2016 2017-01-31 09:53:36 +00:00
Brian Gough
d238f73e29 try output.pdf generated with texlive 2016 2017-01-30 15:37:26 +00:00
Brian Gough
ea484da9f4 update latex_compiler test pdf 2017-01-27 12:32:14 +00:00
Brian Gough
b76a81e98b specify papersize explicitly in latex test 2017-01-27 12:21:57 +00:00
Brian Gough
f00be9018d log acceptance test server output to file 2017-01-26 12:20:41 +00:00
Brian Gough
146138f65c try running user as jenkins 2017-01-26 12:06:38 +00:00
Brian Gough
654a43655f update image for docker tests 2017-01-25 14:12:19 +00:00
Brian Gough
b9d6db6caf use local docker image for clsi test 2017-01-25 14:09:44 +00:00
Brian Gough
03e837c1f4 run tests outside container, add settings file 2017-01-25 14:08:39 +00:00
Brian Gough
420db18a03 upgrade to latest sqlite3 2017-01-24 16:06:32 +00:00
Brian Gough
dab92967c8 added docker script for acceptance tests 2017-01-24 12:18:30 +00:00
Brian Gough
0530e21246 fix acceptance tests 2017-01-24 11:07:54 +00:00
Brian Gough
9e53c0b99e fix exception in error log 2016-10-14 10:23:13 +01:00
Shane Kilkelly
61089eca40 Increase memory limit to 64mb 2016-09-28 11:02:58 +01:00
Shane Kilkelly
4827aec30b Add test for new ulimit options 2016-09-23 15:34:29 +01:00
Shane Kilkelly
0900340282 Add CHKTEX_ULIMIT_OPTIONS 2016-09-23 15:32:37 +01:00
James Allen
f7b4883397 Don't delete knitr cache files 2016-09-22 14:14:29 +01:00
James Allen
79b3d2172b Sanitize resource path along with rootResourcePath 2016-09-21 15:09:01 +01:00
Brian Gough
9f49dc8554 Merge pull request #45 from sharelatex/fix-chktex-for-knitr
only run chktex on .tex files, not .Rtex files
2016-09-12 16:36:59 +01:00
Brian Gough
ee170b4e67 only run chktex on .tex files, not .Rtex files
the .tex files produced from knitr have macros which confuse chktex
2016-09-12 16:29:36 +01:00
Shane Kilkelly
47105190be Revert "Revert "Revert "Upgrade to node 4.2"""
This reverts commit 98fb2cab99.
2016-09-01 12:47:13 +01:00
Shane Kilkelly
98fb2cab99 Revert "Revert "Upgrade to node 4.2""
This reverts commit 4128dc6fdd.
2016-09-01 11:22:11 +01:00
Shane Kilkelly
4128dc6fdd Revert "Upgrade to node 4.2"
This reverts commit 8bb12f4d99.
2016-09-01 09:53:12 +01:00
Shane Kilkelly
4a2b2a8707 Merge branch 'master' into sk-node-upgrade 2016-08-31 16:34:25 +01:00
Brian Gough
095e16e953 handle failed compile due to validation error 2016-08-24 15:46:47 +01:00
Brian Gough
3a73971b42 fix commandRunner error to match dockerRunner 2016-08-24 15:45:26 +01:00
Brian Gough
748caeee7d remove chktex error
too many false positives from 'unable to execute latex command'
2016-08-22 15:11:39 +01:00
Brian Gough
cd7ed6ce66 update tests 2016-08-11 10:31:37 +01:00
Brian Gough
2200ac2cf2 capture texcount error output 2016-08-11 10:26:08 +01:00
Brian Gough
928ffc96e6 read wordcount output asynchronously 2016-08-11 09:32:53 +01:00
Brian Gough
ade3da7e0d add missing argument parameter to wordcount call 2016-08-11 09:29:03 +01:00
Brian Gough
e66b1ecdea use a command wrapper for synctex
instead of an alternative child_process object
2016-08-04 16:08:14 +01:00
Brian Gough
c6744caeeb change logging message to be different from LatexRunner 2016-08-04 16:07:36 +01:00
Brian Gough
189648e39a Merge pull request #44 from sharelatex/add-chktex-support
Add chktex support
2016-08-02 14:55:38 +01:00
Brian Gough
8da29e6024 provide setting to override child_process.execFile for synctex 2016-07-29 14:54:24 +01:00
Brian Gough
664e908378 provide validation mode where compilation always exits after chktex 2016-07-27 16:54:27 +01:00
Brian Gough
14837a57ec run chktex when request has check:true 2016-07-26 16:22:38 +01:00
Brian Gough
6524439699 add support for passing additional environment parameters to command runner
includes an example of passing environment variables to chktex
2016-07-26 12:30:29 +01:00
Brian Gough
a7c7f2697f Merge pull request #43 from sharelatex/stop-compile
add support for stopping compile
2016-07-18 11:16:53 +01:00
Brian Gough
fdf274fb82 remove dead code 2016-07-18 11:05:45 +01:00
Brian Gough
69666bef60 add support for stopping compile 2016-07-14 16:43:52 +01:00
Henry Oswald
cd8e60195c Merge pull request #42 from WaeCo/patch-1
Set default project_cache_length_ms to 1 day
2016-07-13 21:32:02 +01:00
WaeCo
d6808c11cc Set default project_cache_length_ms to 1 day
`project_cache_length_ms` was only `60*60*24` ms (about 1.5 minutes), which is far too short. A default of one day seems more reasonable.
2016-07-13 13:26:32 -07:00
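The bug above is easy to reproduce: the setting is in milliseconds, so a value of `60*60*24` covers well under two minutes, not a day. A quick sanity check (variable names are illustrative):

```python
# The old default was written as if it were seconds, but the setting
# is interpreted in milliseconds.
old_ms = 60 * 60 * 24              # 86400 ms, i.e. ~1.4 minutes of cache
one_day_ms = 1000 * 60 * 60 * 24   # 86400000 ms, the corrected default

print(old_ms / 1000 / 60)  # minutes the old value actually covered
print(one_day_ms)
```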
Brian Gough
133f522e7b Merge pull request #41 from sharelatex/per-user-containers-part-3
Reduce number of cached builds for per-user containers
2016-06-30 08:06:01 +01:00
Brian Gough
d29416fc77 keep one extra build until per-page pdf serving is enabled 2016-06-29 16:31:16 +01:00
Brian Gough
c486d6c215 only keep a single cached output directory in per-user containers 2016-06-28 09:28:40 +01:00
Shane Kilkelly
8bb12f4d99 Upgrade to node 4.2 2016-06-20 09:31:30 +01:00
Shane Kilkelly
e4ffc94de8 Move the latexmk timing command into a configurable latexmkCommandPrefix.
By default, no timing information will be taken.
On Linux with GNU user land, this value should be configured to `["/usr/bin/time", "-v"]`.
On Mac, gnu-time should be installed and configured to `["/usr/local/bin/gtime", "-v"]`.
2016-06-17 14:38:08 +01:00
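The commit message maps directly onto a settings entry; a sketch of the relevant fragment, with the key name taken from the message and the surrounding structure assumed:

```coffee
# Sketch only: latexmkCommandPrefix placement is assumed, not verified
# against the repository's settings file. Default is unset (no timing).
module.exports =
	clsi:
		latexmkCommandPrefix: ["/usr/bin/time", "-v"]           # Linux, GNU time
		# latexmkCommandPrefix: ["/usr/local/bin/gtime", "-v"]  # Mac, gnu-time installed via brew
```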
Brian Gough
0b8435e358 add route to serve files from top level of per user containers 2016-06-15 16:12:19 +01:00
Brian Gough
801f09e7ed Merge branch 'per-user-containers-part-2'
Conflicts:
	app/coffee/CompileController.coffee
2016-06-13 09:33:41 +01:00
Brian Gough
603b3d617c Merge pull request #39 from sharelatex/per-user-containers-part-1
Per user containers part 1
2016-06-09 15:17:35 +01:00
Henry Oswald
b97627d6d8 use process id so link process to smoke test 2016-06-07 14:47:51 +01:00
Henry Oswald
da02661d53 add random string to smoke tests to avoid collision 2016-06-07 14:39:01 +01:00
Brian Gough
6e017ecaf1 log user_id when clearing project 2016-06-02 15:32:33 +01:00
Brian Gough
0887fe3a72 add per-user routes for clearing cache and extend expiry methods
this adds separate functionality for clearing the cache (assets and
database) and the project compile directory for a specific user
2016-06-02 15:32:33 +01:00
Brian Gough
226e6c87b1 add per-user routes and methods 2016-06-02 15:32:31 +01:00
Brian Gough
8c42a353e1 put the build id in the output file urls
the url attribute will now give the preferred location for accessing
the output file, without the url having to be constructed by the web
client
2016-06-02 15:30:50 +01:00
Brian Gough
78b88683fc put the build id in the output file urls
the url attribute will now give the preferred location for accessing
the output file, without the url having to be constructed by the web
client
2016-06-02 15:29:56 +01:00
Henry Oswald
ac3b7a571a log out error on synctex 2016-05-27 16:18:18 +01:00
Henry Oswald
cda1e301f6 log out errors more clearly 2016-05-27 14:45:39 +01:00
Henry Oswald
da324a8dd0 added logger.info to test setup 2016-05-24 14:12:02 +01:00
Henry Oswald
b2f687c061 log out which command logger is used 2016-05-24 14:08:39 +01:00
Henry Oswald
2c3b1126b0 log out if the command running is being used 2016-05-23 15:45:39 +01:00
Henry Oswald
22f730c3e9 parallelFileDownloads defaults to 1, sql can't take it 2016-05-23 14:31:27 +01:00
Henry Oswald
2e97bcba3a add error handler on CommandRunner 2016-05-23 14:13:55 +01:00
Brian Gough
0da85d5d03 be ready to serve files from per-user containers 2016-05-20 10:23:07 +01:00
Brian Gough
3379577499 fix error in log for expiry timeout 2016-05-20 10:23:07 +01:00
Henry Oswald
855169b571 Merge branch 'master' of https://github.com/sharelatex/clsi-sharelatex 2016-05-19 16:57:19 +01:00
Henry Oswald
6b107bd20a log out EXPIRY_TIMEOUT 2016-05-19 16:57:14 +01:00
Henry Oswald
a2c2fc3a51 make cached assets ttl set via config 2016-05-19 16:51:50 +01:00
Brian Gough
f8ae215c1e avoid clobbering the existing port variable 2016-05-19 16:38:18 +01:00
Brian Gough
d26c6b933e return the file path in the output file list for easy lookup 2016-05-19 16:38:18 +01:00
Brian Gough
4496ddddfd Merge pull request #38 from sharelatex/add-fast-path-to-pdf
Add fast path to pdf
2016-05-13 12:32:26 +01:00
Brian Gough
434e00cb74 make the build id a secure random token
we allow existing build ids to work for backwards compatibility
this can be removed after some time
2016-05-13 10:11:35 +01:00
Brian Gough
f92c70935b allow direct path to output file /project/project_id/build/build_id/output/*
this avoids use of the query string ?build=... and so we can match the
url directly with the nginx location directive
2016-05-13 10:10:48 +01:00
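The point of moving the build id into the path is that nginx can route on the URL alone, whereas the `location` directive cannot match on a `?build=...` query string. A hypothetical location block illustrating the idea (patterns and upstream name are assumptions, not from the source):

```nginx
# Sketch only: serve per-build output files by path prefix, which the
# old query-string form could not match with a location directive.
location ~ ^/project/[0-9a-zA-Z_-]+/build/[0-9a-f-]+/output/ {
    proxy_pass http://clsi_backend;
}
```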
Brian Gough
51f87c5f79 fix logic excluding smoke test in metric 2016-05-10 10:10:01 +01:00
Brian Gough
143913c67f fix tagname for graphite 2016-05-10 09:41:39 +01:00
Brian Gough
dfd2bc31ef record system time 2016-05-10 09:12:13 +01:00
Brian Gough
e70bd3ae8e preserve existing metric name 2016-05-10 09:12:00 +01:00
Brian Gough
0a5ca6b0fa add timing information from /usr/bin/time 2016-05-09 16:00:24 +01:00
Brian Gough
834668b033 add a metric for the TeXLive image used on each compile 2016-05-09 15:36:11 +01:00
Henry Oswald
35240fbd4d move back to 2.5 days cache for moment 2016-04-21 17:40:09 +01:00
Henry Oswald
5f7cd5ece5 added project status endpoint
used for getting the server a project is on
2016-04-20 15:38:05 +01:00
Henry Oswald
6860d2be6c increased clsi cache to 3.5 days 2016-04-13 09:29:57 +01:00
Henry Oswald
3c021fd4c9 ignore ECONNRESET 2016-04-12 13:32:58 +01:00
Henry Oswald
f453f954e4 use socket.end for tcp checks 2016-04-12 10:49:45 +01:00
Henry Oswald
cd499fa4e5 server load endpoint uses settings for port 2016-04-11 13:47:06 +01:00
Henry Oswald
7799e0bfdd return 0 for server which is being hammered
socket.destroy when finished
2016-04-08 15:40:02 +01:00
Henry Oswald
6ca8c10734 added err handler to socket 2016-04-08 15:25:00 +01:00
Henry Oswald
84cba7365f work off 1 min load and set server as up 2016-04-08 15:18:22 +01:00
Henry Oswald
11be12fc8e evaluate on every call 2016-04-08 14:14:05 +01:00
Henry Oswald
3e70c0f8e4 added example server load tcp server 2016-04-08 13:31:23 +01:00
Brian Gough
558e9ae22b don't log errors when files have disappeared from build directory 2016-04-07 16:16:39 +01:00
Brian Gough
83e373d7e1 log errors in detail when file cannot be removed 2016-04-04 16:22:48 +01:00
Brian Gough
24fc9391c3 upgrade to the latest version of request 2016-03-31 14:03:48 +01:00
Brian Gough
7ff56c4793 suppress error when removing nonexistent file from cache 2016-03-31 13:33:42 +01:00
Brian Gough
665dbff75a parameter check on project_id 2016-03-31 12:12:25 +01:00
Brian Gough
5d6fb4579a remove console.log 2016-03-31 11:59:17 +01:00
Brian Gough
bd036534e5 check directory exists before attempting to clear it 2016-03-31 11:59:17 +01:00
Brian Gough
3dcd4af62e always create project directory when syncing resources to disk
avoids errors when project is empty
2016-03-31 11:59:17 +01:00
Brian Gough
fe46a96fd2 don't log missing files as warnings, but do report file access errors 2016-03-31 11:14:39 +01:00
Brian Gough
8fcbec5c0f add support for sentry 2016-03-30 14:35:47 +01:00
James Allen
fbb00ebf2f Only archive main log and blg 2016-03-30 14:10:07 +01:00
James Allen
6117cac1fd Ignore both .cache and .archive and other hidden files in finding output files 2016-03-30 11:41:11 +01:00
James Allen
d949d4ac32 Don't timestamp strace logs otherwise it runs as a new container each time since the command changes 2016-03-30 10:59:01 +00:00
James Allen
6af22cf184 Add in flags to run strace and capture logs 2016-03-30 10:37:22 +01:00
Brian Gough
9f104a4f57 bugfix - avoid double counting compiles 2016-03-17 14:37:34 +00:00
Brian Gough
595bfe09ac add metric for qpdf 2016-03-17 09:55:18 +00:00
Brian Gough
e64b08fcbe add metrics for latexmk runs and errors 2016-03-17 09:55:18 +00:00
Henry Oswald
dcfe1118d4 increased EXPIRY_TIMEOUT from 1.5 days to 2.5 days 2016-03-10 10:30:37 +00:00
James Allen
89acd36dde Send .svg files as text/plain to prevent executable JS if they are loaded as SVG in the browser 2016-03-10 09:32:32 +00:00
James Allen
a3383f11a1 Make draft mode regex global 2016-02-02 15:28:59 +00:00
James Allen
2df886e330 Remove left over debug log line 2016-02-02 14:28:51 +00:00
James Allen
d96605d5e8 Inject [draft] option to documentclass if draft option is passed 2016-02-02 14:26:14 +00:00
James Allen
03b75b12cf Download up to 5 files in parallel 2016-02-01 13:19:16 +00:00
James Allen
86cf05c732 Support configurable images in wordcount end point 2016-01-19 14:12:41 +00:00
James Allen
4497352a3a Allow optional image name to be passed 2016-01-15 09:59:06 +00:00
Henry Oswald
601a3e4805 Merge branch 'master' of https://github.com/sharelatex/clsi-sharelatex 2015-12-15 19:34:34 +00:00
Henry Oswald
0ea28710f5 fixed missing value in logger 2015-12-15 19:33:37 +00:00
James Allen
2b5e7be964 Remove undefined reference to dst 2015-12-03 14:54:48 +00:00
Henry Oswald
c178458223 added try catch around word count where a file is not created 2015-11-12 15:19:22 +00:00
Henry Oswald
3ed29b3489 increased cache time to 1.5 days 2015-10-21 10:02:30 +01:00
Shane Kilkelly
29be2dc700 When serving output files, intelligently determine the appropriate content-type.
cherry pick 6fa3fda3ed28239cf3ac9720629f9707663aa197 from datajoy.
2015-09-21 16:59:35 +01:00
63 changed files with 2237 additions and 279 deletions

.nvmrc (new file, 1 line added)

@@ -0,0 +1 @@
6.11.2

.travis.yml

@@ -1,8 +1,5 @@
language: node_js
node_js:
- "0.10"
before_install:
- npm install -g grunt-cli

Gruntfile.coffee

@@ -46,6 +46,11 @@ module.exports = (grunt) ->
app:
src: "app.js"
mkdir:
all:
options:
create: ["cache", "compiles"]
mochaTest:
unit:
options:
@@ -70,6 +75,7 @@ module.exports = (grunt) ->
grunt.loadNpmTasks 'grunt-shell'
grunt.loadNpmTasks 'grunt-execute'
grunt.loadNpmTasks 'grunt-bunyan'
grunt.loadNpmTasks 'grunt-mkdir'
grunt.registerTask 'compile:bin', () ->
callback = @async()
@@ -93,6 +99,6 @@ module.exports = (grunt) ->
grunt.registerTask 'install', 'compile:app'
grunt.registerTask 'default', ['run']
grunt.registerTask 'default', ['mkdir', 'run']

Jenkinsfile (new vendored file, 101 lines added)

@@ -0,0 +1,101 @@
pipeline {
agent any
triggers {
pollSCM('* * * * *')
cron('@daily')
}
stages {
stage('Clean') {
steps {
// This is a terrible hack to set the file ownership to jenkins:jenkins so we can cleanup the directory
sh 'docker run -v $(pwd):/app --rm busybox /bin/chown -R 111:119 /app'
sh 'rm -fr node_modules'
}
}
stage('Install') {
agent {
docker {
image 'node:6.11.2'
args "-v /var/lib/jenkins/.npm:/tmp/.npm -e HOME=/tmp"
reuseNode true
}
}
steps {
sh 'git config --global core.logallrefupdates false'
sh 'rm -fr node_modules'
checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: '_docker-runner'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/docker-runner-sharelatex']]])
sh 'npm install ./_docker-runner'
sh 'rm -fr ./_docker-runner ./_docker-runner@tmp'
sh 'npm install'
sh 'npm rebuild'
sh 'npm install --quiet grunt-cli'
}
}
stage('Compile and Test') {
agent {
docker {
image 'node:6.11.2'
reuseNode true
}
}
steps {
sh 'node_modules/.bin/grunt compile:app'
sh 'node_modules/.bin/grunt compile:acceptance_tests'
sh 'NODE_ENV=development node_modules/.bin/grunt test:unit'
}
}
stage('Acceptance Tests') {
environment {
TEXLIVE_IMAGE="quay.io/sharelatex/texlive-full:2017.1"
}
steps {
sh 'mkdir -p compiles cache'
// Not yet running, due to volumes/sibling containers
sh 'docker container prune -f'
sh 'docker pull $TEXLIVE_IMAGE'
sh 'docker pull sharelatex/acceptance-test-runner:clsi-6.11.2'
sh 'docker run --rm -e SIBLING_CONTAINER_USER=root -e SANDBOXED_COMPILES_HOST_DIR=$(pwd)/compiles -e SANDBOXED_COMPILES_SIBLING_CONTAINERS=true -e TEXLIVE_IMAGE=$TEXLIVE_IMAGE -v /var/run/docker.sock:/var/run/docker.sock -v $(pwd):/app sharelatex/acceptance-test-runner:clsi-6.11.2'
// This is a terrible hack to set the file ownership to jenkins:jenkins so we can cleanup the directory
sh 'docker run -v $(pwd):/app --rm busybox /bin/chown -R 111:119 /app'
sh 'rm -r compiles cache server.log db.sqlite config/settings.defaults.coffee'
}
}
stage('Package') {
steps {
sh 'echo ${BUILD_NUMBER} > build_number.txt'
sh 'touch build.tar.gz' // Avoid tar warning about files changing during read
sh 'tar -czf build.tar.gz --exclude=build.tar.gz --exclude-vcs .'
}
}
stage('Publish') {
steps {
withAWS(credentials:'S3_CI_BUILDS_AWS_KEYS', region:"${S3_REGION_BUILD_ARTEFACTS}") {
s3Upload(file:'build.tar.gz', bucket:"${S3_BUCKET_BUILD_ARTEFACTS}", path:"${JOB_NAME}/${BUILD_NUMBER}.tar.gz")
// The deployment process uses this file to figure out the latest build
s3Upload(file:'build_number.txt', bucket:"${S3_BUCKET_BUILD_ARTEFACTS}", path:"${JOB_NAME}/latest")
}
}
}
}
post {
failure {
mail(from: "${EMAIL_ALERT_FROM}",
to: "${EMAIL_ALERT_TO}",
subject: "Jenkins build failed: ${JOB_NAME}:${BUILD_NUMBER}",
body: "Build: ${BUILD_URL}")
}
}
// The options directive is for configuration that applies to the whole job.
options {
// we'd like to make sure we remove old builds, so we don't fill up our storage!
buildDiscarder(logRotator(numToKeepStr:'50'))
// And we'd really like to be sure that this build doesn't hang forever, so let's time it out after:
timeout(time: 30, unit: 'MINUTES')
}
}

View File

@@ -2,7 +2,12 @@ CompileController = require "./app/js/CompileController"
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
logger.initialize("clsi")
if Settings.sentry?.dsn?
logger.initializeErrorReporting(Settings.sentry.dsn)
smokeTest = require "smoke-test-sharelatex"
ContentTypeMapper = require "./app/js/ContentTypeMapper"
Errors = require './app/js/Errors'
Path = require "path"
fs = require "fs"
@@ -32,12 +37,42 @@ app.use (req, res, next) ->
res.setTimeout TIMEOUT
next()
app.param 'project_id', (req, res, next, project_id) ->
if project_id?.match /^[a-zA-Z0-9_-]+$/
next()
else
next new Error("invalid project id")
app.param 'user_id', (req, res, next, user_id) ->
if user_id?.match /^[0-9a-f]{24}$/
next()
else
next new Error("invalid user id")
app.param 'build_id', (req, res, next, build_id) ->
if build_id?.match OutputCacheManager.BUILD_REGEX
next()
else
next new Error("invalid build id #{build_id}")
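The three `app.param` guards above are regex whitelists on path parameters. A plain-JS sketch of the first two checks (the build-id regex is omitted because its value lives on `OutputCacheManager` and is not shown here):

```javascript
// Plain-JS sketch of the project_id and user_id validators above.
const PROJECT_ID_REGEX = /^[a-zA-Z0-9_-]+$/;
const USER_ID_REGEX = /^[0-9a-f]{24}$/; // 24 hex chars, i.e. a Mongo ObjectId

const isValidProjectId = (id) => PROJECT_ID_REGEX.test(id);
const isValidUserId = (id) => USER_ID_REGEX.test(id);
```

Rejecting anything with `/` or `.` in `project_id` is what keeps these parameters safe to splice into filesystem paths later on.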
app.post "/project/:project_id/compile", bodyParser.json(limit: "5mb"), CompileController.compile
app.post "/project/:project_id/compile/stop", CompileController.stopCompile
app.delete "/project/:project_id", CompileController.clearCache
app.get "/project/:project_id/sync/code", CompileController.syncFromCode
app.get "/project/:project_id/sync/pdf", CompileController.syncFromPdf
app.get "/project/:project_id/wordcount", CompileController.wordcount
app.get "/project/:project_id/status", CompileController.status
# Per-user containers
app.post "/project/:project_id/user/:user_id/compile", bodyParser.json(limit: "5mb"), CompileController.compile
app.post "/project/:project_id/user/:user_id/compile/stop", CompileController.stopCompile
app.delete "/project/:project_id/user/:user_id", CompileController.clearCache
app.get "/project/:project_id/user/:user_id/sync/code", CompileController.syncFromCode
app.get "/project/:project_id/user/:user_id/sync/pdf", CompileController.syncFromPdf
app.get "/project/:project_id/user/:user_id/wordcount", CompileController.wordcount
ForbidSymlinks = require "./app/js/StaticServerForbidSymlinks"
@@ -46,17 +81,28 @@ ForbidSymlinks = require "./app/js/StaticServerForbidSymlinks"
# and serving the files
staticServer = ForbidSymlinks express.static, Settings.path.compilesDir, setHeaders: (res, path, stat) ->
if Path.basename(path) == "output.pdf"
res.set("Content-Type", "application/pdf")
# Calculate an etag in the same way as nginx
# https://github.com/tj/send/issues/65
etag = (path, stat) ->
'"' + Math.ceil(+stat.mtime / 1000).toString(16) +
'-' + Number(stat.size).toString(16) + '"'
res.set("Etag", etag(path, stat))
else
# Force plain treatment of other file types to prevent hosting of HTTP/JS files
# that could be used in same-origin/XSS attacks.
res.set("Content-Type", "text/plain")
res.set("Content-Type", ContentTypeMapper.map(path))
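The nginx-compatible ETag above is a pure function of the file's stat: hex mtime in whole seconds and hex size, joined with `-` and wrapped in quotes. A plain-JS sketch:

```javascript
// Plain-JS sketch of the nginx-style ETag computed in the setHeaders callback.
// stat needs only mtime (Date or ms timestamp) and size (bytes).
const etag = (stat) =>
  '"' + Math.ceil(+stat.mtime / 1000).toString(16) +
  '-' + Number(stat.size).toString(16) + '"';
```

Matching nginx's format means a PDF served via this static server and one served via nginx produce identical ETags, so client caches stay valid across either path.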
app.get "/project/:project_id/user/:user_id/build/:build_id/output/*", (req, res, next) ->
# for specific build get the path from the OutputCacheManager (e.g. .clsi/buildId)
req.url = "/#{req.params.project_id}-#{req.params.user_id}/" + OutputCacheManager.path(req.params.build_id, "/#{req.params[0]}")
staticServer(req, res, next)
app.get "/project/:project_id/build/:build_id/output/*", (req, res, next) ->
# for specific build get the path from the OutputCacheManager (e.g. .clsi/buildId)
req.url = "/#{req.params.project_id}/" + OutputCacheManager.path(req.params.build_id, "/#{req.params[0]}")
staticServer(req, res, next)
app.get "/project/:project_id/user/:user_id/output/*", (req, res, next) ->
# for specific user get the path to the top level file
req.url = "/#{req.params.project_id}-#{req.params.user_id}/#{req.params[0]}"
staticServer(req, res, next)
app.get "/project/:project_id/output/*", (req, res, next) ->
if req.query?.build? && req.query.build.match(OutputCacheManager.BUILD_REGEX)
@@ -66,6 +112,11 @@ app.get "/project/:project_id/output/*", (req, res, next) ->
req.url = "/#{req.params.project_id}/#{req.params[0]}"
staticServer(req, res, next)
app.get "/oops", (req, res, next) ->
logger.error {err: "hello"}, "test error"
res.send "error\n"
app.get "/status", (req, res, next) ->
res.send "CLSI is alive\n"
@@ -82,7 +133,7 @@ if Settings.smokeTest
do runSmokeTest = ->
logger.log("running smoke tests")
smokeTest.run(require.resolve(__dirname + "/test/smoke/js/SmokeTests.js"))({}, resCacher)
setTimeout(runSmokeTest, 20 * 1000)
setTimeout(runSmokeTest, 30 * 1000)
app.get "/health_check", (req, res)->
res.contentType(resCacher?.setContentType)
@@ -102,8 +153,12 @@ app.get "/heapdump", (req, res)->
res.send filename
app.use (error, req, res, next) ->
logger.error err: error, "server error"
res.sendStatus(error?.statusCode || 500)
if error instanceof Errors.NotFoundError
logger.warn {err: error, url: req.url}, "not found error"
return res.sendStatus(404)
else
logger.error {err: error, url: req.url}, "server error"
res.sendStatus(error?.statusCode || 500)
app.listen port = (Settings.internal?.clsi?.port or 3013), host = (Settings.internal?.clsi?.host or "localhost"), (error) ->
logger.info "CLSI starting up, listening on #{host}:#{port}"
@@ -111,3 +166,4 @@ app.listen port = (Settings.internal?.clsi?.port or 3013), host = (Settings.inte
setInterval () ->
ProjectPersistenceManager.clearExpiredProjects()
, tenMinutes = 10 * 60 * 1000


@@ -1,12 +1,44 @@
spawn = require("child_process").spawn
logger = require "logger-sharelatex"
logger.info "using standard command runner"
module.exports = CommandRunner =
run: (project_id, command, directory, timeout, callback = (error) ->) ->
run: (project_id, command, directory, image, timeout, environment, callback = (error) ->) ->
command = (arg.replace('$COMPILE_DIR', directory) for arg in command)
logger.log project_id: project_id, command: command, directory: directory, "running command"
logger.warn "timeouts and sandboxing are not enabled with CommandRunner"
proc = spawn command[0], command.slice(1), stdio: "inherit", cwd: directory
proc.on "close", () ->
callback()
# merge environment settings
env = {}
env[key] = value for key, value of process.env
env[key] = value for key, value of environment
# run command as detached process so it has its own process group (which can be killed if needed)
proc = spawn command[0], command.slice(1), stdio: "inherit", cwd: directory, detached: true, env: env
proc.on "error", (err)->
logger.err err:err, project_id:project_id, command: command, directory: directory, "error running command"
callback(err)
proc.on "close", (code, signal) ->
logger.info code:code, signal:signal, project_id:project_id, "command exited"
if signal is 'SIGTERM' # signal from kill method below
err = new Error("terminated")
err.terminated = true
return callback(err)
else if code is 1 # exit status from chktex
err = new Error("exited")
err.code = code
return callback(err)
else
callback()
return proc.pid # return process id to allow job to be killed if necessary
kill: (pid, callback = (error) ->) ->
try
process.kill -pid # kill all processes in group
catch err
return callback(err)
callback()
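The two comprehension loops near the top of `run` build the child's environment by copying `process.env` and then letting per-request settings win. A hedged plain-JS sketch (the name `mergeEnv` is invented for illustration):

```javascript
// Sketch of CommandRunner's environment merge: start from the base
// environment, then apply overrides (later assignments win).
function mergeEnv(base, overrides) {
  const env = {};
  for (const [key, value] of Object.entries(base)) env[key] = value;
  for (const [key, value] of Object.entries(overrides)) env[key] = value;
  return env;
}
```

The merged object is then passed to `spawn` along with `detached: true`, which is what gives the child its own process group so `process.kill(-pid)` in `kill` can terminate the whole group.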


@@ -4,6 +4,7 @@ Settings = require "settings-sharelatex"
Metrics = require "./Metrics"
ProjectPersistenceManager = require "./ProjectPersistenceManager"
logger = require "logger-sharelatex"
Errors = require "./Errors"
module.exports = CompileController =
compile: (req, res, next = (error) ->) ->
@@ -11,35 +12,64 @@ module.exports = CompileController =
RequestParser.parse req.body, (error, request) ->
return next(error) if error?
request.project_id = req.params.project_id
request.user_id = req.params.user_id if req.params.user_id?
ProjectPersistenceManager.markProjectAsJustAccessed request.project_id, (error) ->
return next(error) if error?
CompileManager.doCompile request, (error, outputFiles = []) ->
if error?
logger.error err: error, project_id: request.project_id, "error running compile"
CompileManager.doCompileWithLock request, (error, outputFiles = []) ->
if error instanceof Errors.AlreadyCompilingError
code = 423 # Http 423 Locked
status = "compile-in-progress"
else if error instanceof Errors.FilesOutOfSyncError
code = 409 # Http 409 Conflict
status = "retry"
else if error?.terminated
status = "terminated"
else if error?.validate
status = "validation-#{error.validate}"
else if error?
if error.timedout
status = "timedout"
logger.log err: error, project_id: request.project_id, "timeout running compile"
else
status = "error"
code = 500
logger.error err: error, project_id: request.project_id, "error running compile"
else
status = "failure"
for file in outputFiles
if file.path?.match(/output\.pdf$/)
status = "success"
if file.path?.match(/output\.html$/)
status = "success"
# log an error if any core files are found
for file in outputFiles
if file.path is "core"
logger.error project_id:request.project_id, req:req, outputFiles:outputFiles, "core file found in output"
timer.done()
res.status(code or 200).send {
compile:
status: status
error: error?.message or error
error: error?.message or error
outputFiles: outputFiles.map (file) ->
url: "#{Settings.apis.clsi.url}/project/#{request.project_id}/output/#{file.path}"
url:
"#{Settings.apis.clsi.url}/project/#{request.project_id}" +
(if request.user_id? then "/user/#{request.user_id}" else "") +
(if file.build? then "/build/#{file.build}" else "") +
"/output/#{file.path}"
path: file.path
type: file.type
build: file.build
}
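The branching above maps error shapes onto an HTTP code and a compile status. A plain-JS sketch of the same mapping, under the assumption that errors carry the same tags (`terminated`, `validate`, `timedout`); the `instanceof` checks are simulated with a `name` property for illustration:

```javascript
// Hypothetical helper mirroring the branching in CompileController.compile.
// When no code is set, the controller falls back to 200.
function classifyCompileError(error) {
  if (error == null) return { code: 200, status: 'failure' }; // upgraded to 'success' if output.pdf exists
  if (error.name === 'AlreadyCompilingError') return { code: 423, status: 'compile-in-progress' };
  if (error.name === 'FilesOutOfSyncError') return { code: 409, status: 'retry' };
  if (error.terminated) return { code: 200, status: 'terminated' };
  if (error.validate) return { code: 200, status: `validation-${error.validate}` };
  if (error.timedout) return { code: 200, status: 'timedout' };
  return { code: 500, status: 'error' };
}
```

423 Locked signals "a compile is already running, don't retry immediately", while 409 Conflict tells the caller its resource list is stale and a full resend will fix it.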
stopCompile: (req, res, next) ->
{project_id, user_id} = req.params
CompileManager.stopCompile project_id, user_id, (error) ->
return next(error) if error?
res.sendStatus(204)
clearCache: (req, res, next = (error) ->) ->
ProjectPersistenceManager.clearProject req.params.project_id, (error) ->
ProjectPersistenceManager.clearProject req.params.project_id, req.params.user_id, (error) ->
return next(error) if error?
res.sendStatus(204) # No content
@@ -48,8 +78,9 @@ module.exports = CompileController =
line = parseInt(req.query.line, 10)
column = parseInt(req.query.column, 10)
project_id = req.params.project_id
user_id = req.params.user_id
CompileManager.syncFromCode project_id, file, line, column, (error, pdfPositions) ->
CompileManager.syncFromCode project_id, user_id, file, line, column, (error, pdfPositions) ->
return next(error) if error?
res.send JSON.stringify {
pdf: pdfPositions
@@ -60,8 +91,9 @@ module.exports = CompileController =
h = parseFloat(req.query.h)
v = parseFloat(req.query.v)
project_id = req.params.project_id
user_id = req.params.user_id
CompileManager.syncFromPdf project_id, page, h, v, (error, codePositions) ->
CompileManager.syncFromPdf project_id, user_id, page, h, v, (error, codePositions) ->
return next(error) if error?
res.send JSON.stringify {
code: codePositions
@@ -70,9 +102,16 @@ module.exports = CompileController =
wordcount: (req, res, next = (error) ->) ->
file = req.query.file || "main.tex"
project_id = req.params.project_id
user_id = req.params.user_id
image = req.query.image
logger.log {image, file, project_id}, "word count request"
CompileManager.wordcount project_id, file, (error, result) ->
CompileManager.wordcount project_id, user_id, file, image, (error, result) ->
return next(error) if error?
res.send JSON.stringify {
texcount: result
}
status: (req, res, next = (error)-> )->
res.send("OK")


@@ -7,82 +7,247 @@ Path = require "path"
logger = require "logger-sharelatex"
Metrics = require "./Metrics"
child_process = require "child_process"
CommandRunner = require(Settings.clsi?.commandRunner or "./CommandRunner")
DraftModeManager = require "./DraftModeManager"
TikzManager = require "./TikzManager"
LockManager = require "./LockManager"
fs = require("fs")
fse = require "fs-extra"
os = require("os")
async = require "async"
Errors = require './Errors'
commandRunner = Settings.clsi?.commandRunner or "./CommandRunner"
logger.info commandRunner:commandRunner, "selecting command runner for clsi"
CommandRunner = require(commandRunner)
getCompileName = (project_id, user_id) ->
if user_id? then "#{project_id}-#{user_id}" else project_id
getCompileDir = (project_id, user_id) ->
Path.join(Settings.path.compilesDir, getCompileName(project_id, user_id))
module.exports = CompileManager =
doCompileWithLock: (request, callback = (error, outputFiles) ->) ->
compileDir = getCompileDir(request.project_id, request.user_id)
lockFile = Path.join(compileDir, ".project-lock")
# use a .project-lock file in the compile directory to prevent
# simultaneous compiles
fse.ensureDir compileDir, (error) ->
return callback(error) if error?
LockManager.runWithLock lockFile, (releaseLock) ->
CompileManager.doCompile(request, releaseLock)
, callback
doCompile: (request, callback = (error, outputFiles) ->) ->
compileDir = Path.join(Settings.path.compilesDir, request.project_id)
compileDir = getCompileDir(request.project_id, request.user_id)
timer = new Metrics.Timer("write-to-disk")
logger.log project_id: request.project_id, "starting compile"
ResourceWriter.syncResourcesToDisk request.project_id, request.resources, compileDir, (error) ->
return callback(error) if error?
logger.log project_id: request.project_id, time_taken: Date.now() - timer.start, "written files to disk"
logger.log project_id: request.project_id, user_id: request.user_id, "syncing resources to disk"
ResourceWriter.syncResourcesToDisk request, compileDir, (error, resourceList) ->
# NOTE: resourceList is insecure, it should only be used to exclude files from the output list
if error? and error instanceof Errors.FilesOutOfSyncError
logger.warn project_id: request.project_id, user_id: request.user_id, "files out of sync, please retry"
return callback(error)
else if error?
logger.err err:error, project_id: request.project_id, user_id: request.user_id, "error writing resources to disk"
return callback(error)
logger.log project_id: request.project_id, user_id: request.user_id, time_taken: Date.now() - timer.start, "written files to disk"
timer.done()
timer = new Metrics.Timer("run-compile")
Metrics.inc("compiles")
LatexRunner.runLatex request.project_id, {
directory: compileDir
mainFile: request.rootResourcePath
compiler: request.compiler
timeout: request.timeout
}, (error) ->
return callback(error) if error?
logger.log project_id: request.project_id, time_taken: Date.now() - timer.start, "done compile"
timer.done()
injectDraftModeIfRequired = (callback) ->
if request.draft
DraftModeManager.injectDraftMode Path.join(compileDir, request.rootResourcePath), callback
else
callback()
OutputFileFinder.findOutputFiles request.resources, compileDir, (error, outputFiles) ->
createTikzFileIfRequired = (callback) ->
TikzManager.checkMainFile compileDir, request.rootResourcePath, resourceList, (error, usesTikzExternalize) ->
return callback(error) if error?
OutputCacheManager.saveOutputFiles outputFiles, compileDir, (error, newOutputFiles) ->
callback null, newOutputFiles
if usesTikzExternalize
TikzManager.injectOutputFile compileDir, request.rootResourcePath, callback
else
callback()
# set up environment variables for chktex
env = {}
# only run chktex on LaTeX files (not knitr .Rtex files or any others)
isLaTeXFile = request.rootResourcePath?.match(/\.tex$/i)
if request.check? and isLaTeXFile
env['CHKTEX_OPTIONS'] = '-nall -e9 -e10 -w15 -w16'
env['CHKTEX_ULIMIT_OPTIONS'] = '-t 5 -v 64000'
if request.check is 'error'
env['CHKTEX_EXIT_ON_ERROR'] = 1
if request.check is 'validate'
env['CHKTEX_VALIDATE'] = 1
# apply a series of file modifications/creations for draft mode and tikz
async.series [injectDraftModeIfRequired, createTikzFileIfRequired], (error) ->
return callback(error) if error?
timer = new Metrics.Timer("run-compile")
# find the image tag to log it as a metric, e.g. 2015.1 (convert . to - for graphite)
tag = request.imageName?.match(/:(.*)/)?[1]?.replace(/\./g,'-') or "default"
tag = "other" if not request.project_id.match(/^[0-9a-f]{24}$/) # exclude smoke test
Metrics.inc("compiles")
Metrics.inc("compiles-with-image.#{tag}")
compileName = getCompileName(request.project_id, request.user_id)
LatexRunner.runLatex compileName, {
directory: compileDir
mainFile: request.rootResourcePath
compiler: request.compiler
timeout: request.timeout
image: request.imageName
environment: env
}, (error, output, stats, timings) ->
# request was for validation only
if request.check is "validate"
result = if error?.code then "fail" else "pass"
error = new Error("validation")
error.validate = result
# request was for compile, and failed on validation
if request.check is "error" and error?.message is 'exited'
error = new Error("compilation")
error.validate = "fail"
# compile was killed by user, was a validation, or a compile which failed validation
if error?.terminated or error?.validate
OutputFileFinder.findOutputFiles resourceList, compileDir, (err, outputFiles) ->
return callback(err) if err?
callback(error, outputFiles) # return output files so user can check logs
return
# compile completed normally
return callback(error) if error?
Metrics.inc("compiles-succeeded")
for metric_key, metric_value of stats or {}
Metrics.count(metric_key, metric_value)
for metric_key, metric_value of timings or {}
Metrics.timing(metric_key, metric_value)
loadavg = os.loadavg?()
Metrics.gauge("load-avg", loadavg[0]) if loadavg?
ts = timer.done()
logger.log {project_id: request.project_id, user_id: request.user_id, time_taken: ts, stats:stats, timings:timings, loadavg:loadavg}, "done compile"
if stats?["latex-runs"] > 0
Metrics.timing("run-compile-per-pass", ts / stats["latex-runs"])
if stats?["latex-runs"] > 0 and timings?["cpu-time"] > 0
Metrics.timing("run-compile-cpu-time-per-pass", timings["cpu-time"] / stats["latex-runs"])
OutputFileFinder.findOutputFiles resourceList, compileDir, (error, outputFiles) ->
return callback(error) if error?
OutputCacheManager.saveOutputFiles outputFiles, compileDir, (error, newOutputFiles) ->
callback null, newOutputFiles
clearProject: (project_id, _callback = (error) ->) ->
stopCompile: (project_id, user_id, callback = (error) ->) ->
compileName = getCompileName(project_id, user_id)
LatexRunner.killLatex compileName, callback
clearProject: (project_id, user_id, _callback = (error) ->) ->
callback = (error) ->
_callback(error)
_callback = () ->
compileDir = Path.join(Settings.path.compilesDir, project_id)
proc = child_process.spawn "rm", ["-r", compileDir]
compileDir = getCompileDir(project_id, user_id)
proc.on "error", callback
CompileManager._checkDirectory compileDir, (err, exists) ->
return callback(err) if err?
return callback() if not exists # skip removal if no directory present
stderr = ""
proc.stderr.on "data", (chunk) -> stderr += chunk.toString()
proc = child_process.spawn "rm", ["-r", compileDir]
proc.on "close", (code) ->
if code == 0
return callback(null)
proc.on "error", callback
stderr = ""
proc.stderr.on "data", (chunk) -> stderr += chunk.toString()
proc.on "close", (code) ->
if code == 0
return callback(null)
else
return callback(new Error("rm -r #{compileDir} failed: #{stderr}"))
_findAllDirs: (callback = (error, allDirs) ->) ->
root = Settings.path.compilesDir
fs.readdir root, (err, files) ->
return callback(err) if err?
allDirs = (Path.join(root, file) for file in files)
callback(null, allDirs)
clearExpiredProjects: (max_cache_age_ms, callback = (error) ->) ->
now = Date.now()
# action for each directory
expireIfNeeded = (checkDir, cb) ->
fs.stat checkDir, (err, stats) ->
return cb() if err? # ignore errors checking directory
age = now - stats.mtime
hasExpired = (age > max_cache_age_ms)
if hasExpired then fse.remove(checkDir, cb) else cb()
# iterate over all project directories
CompileManager._findAllDirs (error, allDirs) ->
return callback() if error?
async.eachSeries allDirs, expireIfNeeded, callback
_checkDirectory: (compileDir, callback = (error, exists) ->) ->
fs.lstat compileDir, (err, stats) ->
if err?.code is 'ENOENT'
return callback(null, false) # directory does not exist
else if err?
logger.err {dir: compileDir, err:err}, "error on stat of project directory for removal"
return callback(err)
else if not stats?.isDirectory()
logger.err {dir: compileDir, stats:stats}, "bad project directory for removal"
return callback new Error("project directory is not directory")
else
return callback(new Error("rm -r #{compileDir} failed: #{stderr}"))
callback(null, true) # directory exists
syncFromCode: (project_id, file_name, line, column, callback = (error, pdfPositions) ->) ->
syncFromCode: (project_id, user_id, file_name, line, column, callback = (error, pdfPositions) ->) ->
# If LaTeX was run in a virtual environment, the file path that synctex expects
might not match the file path on the host. The .synctex.gz file, however, will be accessed
# wherever it is on the host.
base_dir = Settings.path.synctexBaseDir(project_id)
compileName = getCompileName(project_id, user_id)
base_dir = Settings.path.synctexBaseDir(compileName)
file_path = base_dir + "/" + file_name
synctex_path = Path.join(Settings.path.compilesDir, project_id, "output.pdf")
compileDir = getCompileDir(project_id, user_id)
synctex_path = Path.join(compileDir, "output.pdf")
CompileManager._runSynctex ["code", synctex_path, file_path, line, column], (error, stdout) ->
return callback(error) if error?
logger.log project_id: project_id, file_name: file_name, line: line, column: column, stdout: stdout, "synctex code output"
logger.log project_id: project_id, user_id:user_id, file_name: file_name, line: line, column: column, stdout: stdout, "synctex code output"
callback null, CompileManager._parseSynctexFromCodeOutput(stdout)
syncFromPdf: (project_id, page, h, v, callback = (error, filePositions) ->) ->
base_dir = Settings.path.synctexBaseDir(project_id)
synctex_path = Path.join(Settings.path.compilesDir, project_id, "output.pdf")
syncFromPdf: (project_id, user_id, page, h, v, callback = (error, filePositions) ->) ->
compileName = getCompileName(project_id, user_id)
base_dir = Settings.path.synctexBaseDir(compileName)
compileDir = getCompileDir(project_id, user_id)
synctex_path = Path.join(compileDir, "output.pdf")
CompileManager._runSynctex ["pdf", synctex_path, page, h, v], (error, stdout) ->
return callback(error) if error?
logger.log project_id: project_id, page: page, h: h, v:v, stdout: stdout, "synctex pdf output"
logger.log project_id: project_id, user_id:user_id, page: page, h: h, v:v, stdout: stdout, "synctex pdf output"
callback null, CompileManager._parseSynctexFromPdfOutput(stdout, base_dir)
_checkFileExists: (path, callback = (error) ->) ->
synctexDir = Path.dirname(path)
synctexFile = Path.join(synctexDir, "output.synctex.gz")
fs.stat synctexDir, (error, stats) ->
if error?.code is 'ENOENT'
return callback(new Errors.NotFoundError("called synctex with no output directory"))
return callback(error) if error?
fs.stat synctexFile, (error, stats) ->
if error?.code is 'ENOENT'
return callback(new Errors.NotFoundError("called synctex with no output file"))
return callback(error) if error?
return callback(new Error("not a file")) if not stats?.isFile()
callback()
_runSynctex: (args, callback = (error, stdout) ->) ->
bin_path = Path.resolve(__dirname + "/../../bin/synctex")
seconds = 1000
child_process.execFile bin_path, args, timeout: 10 * seconds, (error, stdout, stderr) ->
outputFilePath = args[1]
CompileManager._checkFileExists outputFilePath, (error) ->
return callback(error) if error?
callback(null, stdout)
if Settings.clsi?.synctexCommandWrapper?
[bin_path, args] = Settings.clsi?.synctexCommandWrapper bin_path, args
child_process.execFile bin_path, args, timeout: 10 * seconds, (error, stdout, stderr) ->
if error?
logger.err err:error, args:args, "error running synctex"
return callback(error)
callback(null, stdout)
_parseSynctexFromCodeOutput: (output) ->
results = []
@@ -111,17 +276,23 @@ module.exports = CompileManager =
}
return results
wordcount: (project_id, file_name, callback = (error, pdfPositions) ->) ->
logger.log project_id:project_id, file_name:file_name, "running wordcount"
wordcount: (project_id, user_id, file_name, image, callback = (error, pdfPositions) ->) ->
logger.log project_id:project_id, user_id:user_id, file_name:file_name, image:image, "running wordcount"
file_path = "$COMPILE_DIR/" + file_name
command = [ "texcount", '-inc', file_path, "-out=" + file_path + ".wc"]
directory = Path.join(Settings.path.compilesDir, project_id)
command = [ "texcount", '-nocol', '-inc', file_path, "-out=" + file_path + ".wc"]
directory = getCompileDir(project_id, user_id)
timeout = 10 * 1000
compileName = getCompileName(project_id, user_id)
CommandRunner.run project_id, command, directory, timeout, (error) ->
CommandRunner.run compileName, command, directory, image, timeout, {}, (error) ->
return callback(error) if error?
stdout = fs.readFileSync(directory + "/" + file_name + ".wc", "utf-8")
callback null, CompileManager._parseWordcountFromOutput(stdout)
fs.readFile directory + "/" + file_name + ".wc", "utf-8", (err, stdout) ->
if err?
logger.err err:err, command:command, directory:directory, project_id:project_id, user_id:user_id, "error reading word count output"
return callback(err)
results = CompileManager._parseWordcountFromOutput(stdout)
logger.log project_id:project_id, user_id:user_id, wordcount: results, "word count results"
callback null, results
_parseWordcountFromOutput: (output) ->
results = {
@@ -133,6 +304,8 @@ module.exports = CompileManager =
elements: 0
mathInline: 0
mathDisplay: 0
errors: 0
messages: ""
}
for line in output.split("\n")
[data, info] = line.split(":")
@@ -152,4 +325,8 @@ module.exports = CompileManager =
results['mathInline'] = parseInt(info, 10)
if data.indexOf("Number of math displayed") > -1
results['mathDisplay'] = parseInt(info, 10)
if data is "(errors" # errors reported as (errors:123)
results['errors'] = parseInt(info, 10)
if line.indexOf("!!! ") > -1 # errors logged as !!! message !!!
results['messages'] += line + "\n"
return results
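The scraping above splits each texcount line on `:` and pattern-matches the label. A plain-JS sketch of four of the branches, fed synthetic texcount-shaped lines (real texcount output has more fields, and the exact labels here are taken from the checks above, not from texcount itself):

```javascript
// JS sketch of part of _parseWordcountFromOutput. Input lines are synthetic,
// shaped only to exercise the same label checks used above.
function parseWordcount(output) {
  const results = { mathInline: 0, mathDisplay: 0, errors: 0, messages: '' };
  for (const line of output.split('\n')) {
    const [data, info] = line.split(':');
    if (data.includes('Number of math inline')) results.mathInline = parseInt(info, 10);
    if (data.includes('Number of math displayed')) results.mathDisplay = parseInt(info, 10);
    if (data === '(errors') results.errors = parseInt(info, 10); // errors reported as (errors:123)
    if (line.includes('!!! ')) results.messages += line + '\n';  // errors logged as !!! message !!!
  }
  return results;
}
```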


@@ -0,0 +1,24 @@
Path = require 'path'
# here we coerce html, css and js to text/plain,
# otherwise choose correct mime type based on file extension,
# falling back to octet-stream
module.exports = ContentTypeMapper =
map: (path) ->
switch Path.extname(path)
when '.txt', '.html', '.js', '.css', '.svg'
return 'text/plain'
when '.csv'
return 'text/csv'
when '.pdf'
return 'application/pdf'
when '.png'
return 'image/png'
when '.jpg', '.jpeg'
return 'image/jpeg'
when '.tiff'
return 'image/tiff'
when '.gif'
return 'image/gif'
else
return 'application/octet-stream'


@@ -0,0 +1,24 @@
fs = require "fs"
logger = require "logger-sharelatex"
module.exports = DraftModeManager =
injectDraftMode: (filename, callback = (error) ->) ->
fs.readFile filename, "utf8", (error, content) ->
return callback(error) if error?
# avoid adding draft mode more than once
if content?.indexOf("\\documentclass\[draft") >= 0
return callback()
modified_content = DraftModeManager._injectDraftOption content
logger.log {
content: content.slice(0,1024), # \documentclass is normally very near the top
modified_content: modified_content.slice(0,1024),
filename
}, "injected draft class"
fs.writeFile filename, modified_content, callback
_injectDraftOption: (content) ->
content
# With existing options (must be first, otherwise both are applied)
.replace(/\\documentclass\[/g, "\\documentclass[draft,")
# Without existing options
.replace(/\\documentclass\{/g, "\\documentclass[draft]{")
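The ordering comment above is load-bearing: the bracketed form must be rewritten first, otherwise a bare `\documentclass{...}` would first gain `[draft]` and then be matched again by the bracket rule, injecting `draft` twice. A plain-JS sketch with the same two regexes:

```javascript
// JS sketch of _injectDraftOption: add the draft option to \documentclass,
// handling the with-options form first, then the bare form.
const injectDraftOption = (content) =>
  content
    .replace(/\\documentclass\[/g, '\\documentclass[draft,')
    .replace(/\\documentclass\{/g, '\\documentclass[draft]{');
```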

app/coffee/Errors.coffee (new file, 25 lines)

@@ -0,0 +1,25 @@
NotFoundError = (message) ->
error = new Error(message)
error.name = "NotFoundError"
error.__proto__ = NotFoundError.prototype
return error
NotFoundError.prototype.__proto__ = Error.prototype
FilesOutOfSyncError = (message) ->
error = new Error(message)
error.name = "FilesOutOfSyncError"
error.__proto__ = FilesOutOfSyncError.prototype
return error
FilesOutOfSyncError.prototype.__proto__ = Error.prototype
AlreadyCompilingError = (message) ->
error = new Error(message)
error.name = "AlreadyCompilingError"
error.__proto__ = AlreadyCompilingError.prototype
return error
AlreadyCompilingError.prototype.__proto__ = Error.prototype
module.exports = Errors =
NotFoundError: NotFoundError
FilesOutOfSyncError: FilesOutOfSyncError
AlreadyCompilingError: AlreadyCompilingError
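The `__proto__` shuffling above is the pre-ES6 idiom for making a factory-built error satisfy both `instanceof NotFoundError` and `instanceof Error`. A JS sketch of the same pattern, using `Object.setPrototypeOf` in place of direct `__proto__` assignment (modern code would just write `class NotFoundError extends Error`):

```javascript
// JS version of the Errors.coffee pattern: a factory whose results
// pass instanceof checks against both the subtype and Error.
function NotFoundError(message) {
  const error = new Error(message);
  error.name = 'NotFoundError';
  Object.setPrototypeOf(error, NotFoundError.prototype);
  return error;
}
Object.setPrototypeOf(NotFoundError.prototype, Error.prototype);
```

Because the factory returns an explicit object, it behaves the same whether or not callers use `new`.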


@@ -4,17 +4,19 @@ logger = require "logger-sharelatex"
Metrics = require "./Metrics"
CommandRunner = require(Settings.clsi?.commandRunner or "./CommandRunner")
ProcessTable = {} # table of currently running jobs (pids or docker container names)
module.exports = LatexRunner =
runLatex: (project_id, options, callback = (error) ->) ->
{directory, mainFile, compiler, timeout} = options
{directory, mainFile, compiler, timeout, image, environment} = options
compiler ||= "pdflatex"
timeout ||= 60000 # milliseconds
logger.log directory: directory, compiler: compiler, timeout: timeout, mainFile: mainFile, "starting compile"
logger.log directory: directory, compiler: compiler, timeout: timeout, mainFile: mainFile, environment: environment, "starting compile"
# We want to run latexmk on the tex file which we will automatically
# generate from the Rtex/Rmd/md file.
mainFile = mainFile.replace(/\.(Rtex|md|Rmd)$/, ".tex")
mainFile = mainFile.replace(/\.(Rtex|md|Rmd)$/, ".md")
if compiler == "pdflatex"
command = LatexRunner._pdflatexCommand mainFile
@@ -26,32 +28,73 @@ module.exports = LatexRunner =
command = LatexRunner._lualatexCommand mainFile
else
return callback new Error("unknown compiler: #{compiler}")
if Settings.clsi?.strace
command = ["strace", "-o", "strace", "-ff"].concat(command)
CommandRunner.run project_id, command, directory, timeout, callback
# ignore the above and make a pandoc command
console.log(mainFile)
console.log(image)
image = "ivotron/pandoc"
command = ["-o", "$COMPILE_DIR/output.html", "/compile/" + mainFile]
_latexmkBaseCommand: [ "latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR"]
id = "#{project_id}" # record running project under this id
ProcessTable[id] = CommandRunner.run project_id, command, directory, image, timeout, environment, (error, output) ->
delete ProcessTable[id]
return callback(error) if error?
runs = output?.stderr?.match(/^Run number \d+ of .*latex/mg)?.length or 0
failed = if output?.stdout?.match(/^Latexmk: Errors/m)? then 1 else 0
# counters from latexmk output
stats = {}
stats["latexmk-errors"] = failed
stats["latex-runs"] = runs
stats["latex-runs-with-errors"] = if failed then runs else 0
stats["latex-runs-#{runs}"] = 1
stats["latex-runs-with-errors-#{runs}"] = if failed then 1 else 0
# timing information from /usr/bin/time
timings = {}
stderr = output?.stderr
timings["cpu-percent"] = stderr?.match(/Percent of CPU this job got: (\d+)/m)?[1] or 0
timings["cpu-time"] = stderr?.match(/User time.*: (\d+.\d+)/m)?[1] or 0
timings["sys-time"] = stderr?.match(/System time.*: (\d+.\d+)/m)?[1] or 0
callback error, output, stats, timings
killLatex: (project_id, callback = (error) ->) ->
id = "#{project_id}"
logger.log {id:id}, "killing running compile"
if not ProcessTable[id]?
logger.warn {id}, "no such project to kill"
return callback(null)
else
CommandRunner.kill ProcessTable[id], callback
_latexmkBaseCommand: (Settings?.clsi?.latexmkCommandPrefix || []).concat([
"latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR",
"-synctex=1","-interaction=batchmode"
])
_pdflatexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-pdf", "-e", "$pdflatex='pdflatex -synctex=1 -interaction=batchmode %O %S'",
"-pdf",
Path.join("$COMPILE_DIR", mainFile)
]
_latexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-pdfdvi", "-e", "$latex='latex -synctex=1 -interaction=batchmode %O %S'",
"-pdfdvi",
Path.join("$COMPILE_DIR", mainFile)
]
_xelatexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-xelatex", "-e", "$pdflatex='xelatex -synctex=1 -interaction=batchmode %O %S'",
"-xelatex",
Path.join("$COMPILE_DIR", mainFile)
]
_lualatexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-pdf", "-e", "$pdflatex='lualatex -synctex=1 -interaction=batchmode %O %S'",
"-lualatex",
Path.join("$COMPILE_DIR", mainFile)
]
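The run and timing counters collected by the compile callback above all come from regex scans over the latexmk and `/usr/bin/time` output. A standalone JavaScript sketch of the same scans, using an invented sample of that output (the real text varies by latexmk and TeX Live version):

```javascript
// Invented sample of latexmk stderr plus /usr/bin/time -v output,
// only for exercising the same regexes used in the CoffeeScript above.
const stderr = [
  "Run number 1 of rule 'pdflatex'",
  "Run number 2 of rule 'pdflatex'",
  "Percent of CPU this job got: 98",
  "User time (seconds): 12.34",
  "System time (seconds): 0.56",
].join("\n");

// number of latex runs, CPU percentage and user/system time
const runs = (stderr.match(/^Run number \d+ of .*latex/mg) || []).length;
const cpuPercent = (stderr.match(/Percent of CPU this job got: (\d+)/m) || [])[1] || 0;
const cpuTime = (stderr.match(/User time.*: (\d+\.\d+)/m) || [])[1] || 0;
const sysTime = (stderr.match(/System time.*: (\d+\.\d+)/m) || [])[1] || 0;

console.log({ runs, cpuPercent, cpuTime, sysTime });
```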


@@ -0,0 +1,23 @@
Settings = require('settings-sharelatex')
logger = require "logger-sharelatex"
Lockfile = require('lockfile') # from https://github.com/npm/lockfile
Errors = require "./Errors"
module.exports = LockManager =
LOCK_TEST_INTERVAL: 1000 # 1s between each test of the lock
MAX_LOCK_WAIT_TIME: 15000 # 15s maximum time to spend trying to get the lock
LOCK_STALE: 5*60*1000 # 5 mins until the lock auto-expires
runWithLock: (path, runner = ((releaseLock = (error) ->) ->), callback = ((error) ->)) ->
lockOpts =
wait: @MAX_LOCK_WAIT_TIME
pollPeriod: @LOCK_TEST_INTERVAL
stale: @LOCK_STALE
Lockfile.lock path, lockOpts, (error) ->
return callback new Errors.AlreadyCompilingError("compile in progress") if error?.code is 'EEXIST'
return callback(error) if error?
runner (error1, args...) ->
Lockfile.unlock path, (error2) ->
error = error1 or error2
return callback(error) if error?
callback(null, args...)
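The `runWithLock` flow above maps a lockfile `EEXIST` error to `AlreadyCompilingError` and always unlocks before passing on the runner's result. A minimal JavaScript sketch of that control flow, with the npm `lockfile` module replaced by an in-memory stub (so only the flow, not real file locking, is shown):

```javascript
// Conceptual sketch of runWithLock: EEXIST becomes "compile in
// progress", and the lock is always released before the runner's
// result (or error) reaches the caller.
const locks = new Set();

function lock(path, opts, cb) { // stand-in for Lockfile.lock
  if (locks.has(path)) {
    const err = new Error("lock exists");
    err.code = "EEXIST";
    return cb(err);
  }
  locks.add(path);
  cb(null);
}

function unlock(path, cb) { // stand-in for Lockfile.unlock
  locks.delete(path);
  cb(null);
}

function runWithLock(path, runner, callback) {
  lock(path, { wait: 15000, pollPeriod: 1000, stale: 5 * 60 * 1000 }, (error) => {
    if (error && error.code === "EEXIST")
      return callback(new Error("compile in progress"));
    if (error) return callback(error);
    runner((error1, ...args) => {
      unlock(path, (error2) => {
        const err = error1 || error2;
        if (err) return callback(err);
        callback(null, ...args);
      });
    });
  });
}
```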


@@ -4,12 +4,17 @@ fse = require "fs-extra"
Path = require "path"
logger = require "logger-sharelatex"
_ = require "underscore"
Settings = require "settings-sharelatex"
crypto = require "crypto"
OutputFileOptimiser = require "./OutputFileOptimiser"
module.exports = OutputCacheManager =
CACHE_SUBDIR: '.cache/clsi'
BUILD_REGEX: /^[0-9a-f]+$/ # build id is Date.now() converted to hex
ARCHIVE_SUBDIR: '.archive/clsi'
# build id is HEXDATE-HEXRANDOM from Date.now() and crypto.randomBytes
# for backwards compatibility, make the randombytes part optional
BUILD_REGEX: /^[0-9a-f]+(-[0-9a-f]+)?$/
CACHE_LIMIT: 2 # maximum number of cache directories
CACHE_AGE: 60*60*1000 # up to one hour old
@@ -21,40 +26,33 @@ module.exports = OutputCacheManager =
# for invalid build id, return top level
return file
generateBuildId: (callback = (error, buildId) ->) ->
# generate a secure build id from Date.now() and 8 random bytes in hex
crypto.randomBytes 8, (err, buf) ->
return callback(err) if err?
random = buf.toString('hex')
date = Date.now().toString(16)
callback err, "#{date}-#{random}"
saveOutputFiles: (outputFiles, compileDir, callback = (error) ->) ->
OutputCacheManager.generateBuildId (err, buildId) ->
return callback(err) if err?
OutputCacheManager.saveOutputFilesInBuildDir outputFiles, compileDir, buildId, callback
saveOutputFilesInBuildDir: (outputFiles, compileDir, buildId, callback = (error) ->) ->
# make a compileDir/CACHE_SUBDIR/build_id directory and
# copy all the output files into it
cacheRoot = Path.join(compileDir, OutputCacheManager.CACHE_SUBDIR)
# Put the files into a new cache subdirectory
buildId = Date.now().toString(16)
cacheDir = Path.join(compileDir, OutputCacheManager.CACHE_SUBDIR, buildId)
# let file expiry run in the background
OutputCacheManager.expireOutputFiles cacheRoot, {keep: buildId}
# Is it a per-user compile? check if compile directory is PROJECTID-USERID
perUser = Path.basename(compileDir).match(/^[0-9a-f]{24}-[0-9a-f]{24}$/)
checkFile = (src, callback) ->
# check if we have a valid file to copy into the cache
fs.stat src, (err, stats) ->
# Archive logs in background
if Settings.clsi?.archive_logs or Settings.clsi?.strace
OutputCacheManager.archiveLogs outputFiles, compileDir, buildId, (err) ->
if err?
# some problem reading the file
logger.error err: err, file: src, "stat error for file in cache"
callback(err)
else if not stats.isFile()
# other filetype - reject it
logger.error err: err, src: src, dst: dst, stat: stats, "nonfile output - refusing to copy to cache"
callback(new Error("output file is not a file"), file)
else
# it's a plain file, ok to copy
callback(null)
copyFile = (src, dst, callback) ->
# copy output file into the cache
fse.copy src, dst, (err) ->
if err?
logger.error err: err, src: src, dst: dst, "copy error for file in cache"
callback(err)
else
# call the optimiser for the file too
OutputFileOptimiser.optimiseFile src, dst, callback
logger.warn err:err, "error archiving log files"
# make the new cache directory
fse.ensureDir cacheDir, (err) ->
@@ -63,21 +61,57 @@ module.exports = OutputCacheManager =
callback(err, outputFiles)
else
# copy all the output files into the new cache directory
results = []
async.mapSeries outputFiles, (file, cb) ->
# don't send dot files as output, express doesn't serve them
if OutputCacheManager._fileIsHidden(file.path)
logger.warn compileDir: compileDir, path: file.path, "ignoring dotfile in output"
return cb()
# copy other files into cache directory if valid
newFile = _.clone(file)
[src, dst] = [Path.join(compileDir, file.path), Path.join(cacheDir, file.path)]
checkFile src, (err) ->
copyFile src, dst, (err) ->
if not err?
OutputCacheManager._checkFileIsSafe src, (err, isSafe) ->
return cb(err) if err?
if !isSafe
return cb()
OutputCacheManager._checkIfShouldCopy src, (err, shouldCopy) ->
return cb(err) if err?
if !shouldCopy
return cb()
OutputCacheManager._copyFile src, dst, (err) ->
return cb(err) if err?
newFile.build = buildId # attach a build id if we cached the file
cb(err, newFile)
, (err, results) ->
results.push newFile
cb()
, (err) ->
if err?
# pass back the original files if we encountered *any* error
callback(err, outputFiles)
# clean up the directory we just created
fse.remove cacheDir, (err) ->
if err?
logger.error err: err, dir: cacheDir, "error removing cache dir after failure"
else
# pass back the list of new files in the cache
callback(err, results)
# let file expiry run in the background, expire all previous files if per-user
OutputCacheManager.expireOutputFiles cacheRoot, {keep: buildId, limit: if perUser then 1 else null}
archiveLogs: (outputFiles, compileDir, buildId, callback = (error) ->) ->
archiveDir = Path.join(compileDir, OutputCacheManager.ARCHIVE_SUBDIR, buildId)
logger.log {dir: archiveDir}, "archiving log files for project"
fse.ensureDir archiveDir, (err) ->
return callback(err) if err?
async.mapSeries outputFiles, (file, cb) ->
[src, dst] = [Path.join(compileDir, file.path), Path.join(archiveDir, file.path)]
OutputCacheManager._checkFileIsSafe src, (err, isSafe) ->
return cb(err) if err?
return cb() if !isSafe
OutputCacheManager._checkIfShouldArchive src, (err, shouldArchive) ->
return cb(err) if err?
return cb() if !shouldArchive
OutputCacheManager._copyFile src, dst, cb
, callback
expireOutputFiles: (cacheRoot, options, callback = (error) ->) ->
# look in compileDir for build dirs and delete if > N or age of mod time > T
@@ -92,10 +126,13 @@ module.exports = OutputCacheManager =
isExpired = (dir, index) ->
return false if options?.keep == dir
# remove any directories over the requested (non-null) limit
return true if options?.limit? and index > options.limit
# remove any directories over the hard limit
return true if index > OutputCacheManager.CACHE_LIMIT
# we can get the build time from the directory name
dirTime = parseInt(dir, 16)
# we can get the build time from the first part of the directory name DDDD-RRRR
# DDDD is date and RRRR is random bytes
dirTime = parseInt(dir.split('-')?[0], 16)
age = currentTime - dirTime
return age > OutputCacheManager.CACHE_AGE
@@ -111,3 +148,52 @@ module.exports = OutputCacheManager =
async.eachSeries toRemove, (dir, cb) ->
removeDir dir, cb
, callback
_fileIsHidden: (path) ->
return path?.match(/^\.|\/\./)?
_checkFileIsSafe: (src, callback = (error, isSafe) ->) ->
# check if we have a valid file to copy into the cache
fs.stat src, (err, stats) ->
if err?.code is 'ENOENT'
logger.warn err: err, file: src, "file has disappeared before copying to build cache"
callback(err, false)
else if err?
# some other problem reading the file
logger.error err: err, file: src, "stat error for file in cache"
callback(err, false)
else if not stats.isFile()
# other filetype - reject it
logger.warn src: src, stat: stats, "nonfile output - refusing to copy to cache"
callback(null, false)
else
# it's a plain file, ok to copy
callback(null, true)
_copyFile: (src, dst, callback) ->
# copy output file into the cache
fse.copy src, dst, (err) ->
if err?.code is 'ENOENT'
logger.warn err: err, file: src, "file has disappeared when copying to build cache"
callback(err, false)
else if err?
logger.error err: err, src: src, dst: dst, "copy error for file in cache"
callback(err)
else
if Settings.clsi?.optimiseInDocker
# don't run any optimisations on the pdf when they are done
# in the docker container
callback()
else
# call the optimiser for the file too
OutputFileOptimiser.optimiseFile src, dst, callback
_checkIfShouldCopy: (src, callback = (err, shouldCopy) ->) ->
return callback(null, !Path.basename(src).match(/^strace/))
_checkIfShouldArchive: (src, callback = (err, shouldCopy) ->) ->
if Path.basename(src).match(/^strace/)
return callback(null, true)
if Settings.clsi?.archive_logs and Path.basename(src) in ["output.log", "output.blg"]
return callback(null, true)
return callback(null, false)


@@ -5,7 +5,7 @@ spawn = require("child_process").spawn
logger = require "logger-sharelatex"
module.exports = OutputFileFinder =
findOutputFiles: (resources, directory, callback = (error, outputFiles) ->) ->
findOutputFiles: (resources, directory, callback = (error, outputFiles, allFiles) ->) ->
incomingResources = {}
for resource in resources
incomingResources[resource.path] = true
@@ -13,8 +13,9 @@ module.exports = OutputFileFinder =
logger.log directory: directory, "getting output files"
OutputFileFinder._getAllFiles directory, (error, allFiles = []) ->
return callback(error) if error?
jobs = []
if error?
logger.err err:error, "error finding all output files"
return callback(error)
outputFiles = []
for file in allFiles
if !incomingResources[file]
@@ -22,14 +23,16 @@ module.exports = OutputFileFinder =
path: file
type: file.match(/\.([^\.]+)$/)?[1]
}
callback null, outputFiles
callback null, outputFiles, allFiles
_getAllFiles: (directory, _callback = (error, fileList) ->) ->
callback = (error, fileList) ->
_callback(error, fileList)
_callback = () ->
args = [directory, "-name", ".cache", "-prune", "-o", "-type", "f", "-print"]
# don't include clsi-specific files/directories in the output list
EXCLUDE_DIRS = ["-name", ".cache", "-o", "-name", ".archive","-o", "-name", ".project-*"]
args = [directory, "(", EXCLUDE_DIRS..., ")", "-prune", "-o", "-type", "f", "-print"]
logger.log args: args, "running find command"
proc = spawn("find", args)


@@ -2,6 +2,7 @@ fs = require "fs"
Path = require "path"
spawn = require("child_process").spawn
logger = require "logger-sharelatex"
Metrics = require "./Metrics"
_ = require "underscore"
module.exports = OutputFileOptimiser =
@@ -10,15 +11,31 @@ module.exports = OutputFileOptimiser =
# check output file (src) and see if we can optimise it, storing
# the result in the build directory (dst)
if src.match(/\/output\.pdf$/)
OutputFileOptimiser.optimisePDF src, dst, callback
OutputFileOptimiser.checkIfPDFIsOptimised src, (err, isOptimised) ->
return callback(null) if err? or isOptimised
OutputFileOptimiser.optimisePDF src, dst, callback
else
callback (null)
checkIfPDFIsOptimised: (file, callback) ->
SIZE = 16*1024 # check the header of the pdf
result = new Buffer(SIZE)
result.fill(0) # prevent leakage of uninitialised buffer
fs.open file, "r", (err, fd) ->
return callback(err) if err?
fs.read fd, result, 0, SIZE, 0, (errRead, bytesRead, buffer) ->
fs.close fd, (errClose) ->
return callback(errRead) if errRead?
return callback(errClose) if errClose?
isOptimised = buffer.toString('ascii').indexOf("/Linearized 1") >= 0
callback(null, isOptimised)
optimisePDF: (src, dst, callback = (error) ->) ->
tmpOutput = dst + '.opt'
args = ["--linearize", src, tmpOutput]
logger.log args: args, "running qpdf command"
timer = new Metrics.Timer("qpdf")
proc = spawn("qpdf", args)
stdout = ""
proc.stdout.on "data", (chunk) ->
@@ -28,6 +45,7 @@ module.exports = OutputFileOptimiser =
logger.warn {err, args}, "qpdf failed"
callback(null) # ignore the error
proc.on "close", (code) ->
timer.done()
if code != 0
logger.warn {code, args}, "qpdf returned error"
return callback(null) # ignore the error


@@ -3,9 +3,12 @@ CompileManager = require "./CompileManager"
db = require "./db"
async = require "async"
logger = require "logger-sharelatex"
oneDay = 24 * 60 * 60 * 1000
Settings = require "settings-sharelatex"
module.exports = ProjectPersistenceManager =
EXPIRY_TIMEOUT: oneDay = 24 * 60 * 60 * 1000 #ms
EXPIRY_TIMEOUT: Settings.project_cache_length_ms || oneDay * 2.5
markProjectAsJustAccessed: (project_id, callback = (error) ->) ->
db.Project.findOrCreate(where: {project_id: project_id})
@@ -24,21 +27,30 @@ module.exports = ProjectPersistenceManager =
jobs = for project_id in (project_ids or [])
do (project_id) ->
(callback) ->
ProjectPersistenceManager.clearProject project_id, (err) ->
ProjectPersistenceManager.clearProjectFromCache project_id, (err) ->
if err?
logger.error err: err, project_id: project_id, "error clearing project"
callback()
async.series jobs, callback
clearProject: (project_id, callback = (error) ->) ->
logger.log project_id: project_id, "clearing project"
CompileManager.clearProject project_id, (error) ->
return callback(error) if error?
UrlCache.clearProject project_id, (error) ->
async.series jobs, (error) ->
return callback(error) if error?
ProjectPersistenceManager._clearProjectFromDatabase project_id, (error) ->
return callback(error) if error?
callback()
CompileManager.clearExpiredProjects ProjectPersistenceManager.EXPIRY_TIMEOUT, (error) ->
callback() # ignore any errors from deleting directories
clearProject: (project_id, user_id, callback = (error) ->) ->
logger.log project_id: project_id, user_id:user_id, "clearing project for user"
CompileManager.clearProject project_id, user_id, (error) ->
return callback(error) if error?
ProjectPersistenceManager.clearProjectFromCache project_id, (error) ->
return callback(error) if error?
callback()
clearProjectFromCache: (project_id, callback = (error) ->) ->
logger.log project_id: project_id, "clearing project from cache"
UrlCache.clearProject project_id, (error) ->
return callback(error) if error?
ProjectPersistenceManager._clearProjectFromDatabase project_id, (error) ->
return callback(error) if error?
callback()
_clearProjectFromDatabase: (project_id, callback = (error) ->) ->
db.Project.destroy(where: {project_id: project_id})
@@ -50,3 +62,5 @@ module.exports = ProjectPersistenceManager =
.then((projects) ->
callback null, projects.map((project) -> project.project_id)
).error callback
logger.log {EXPIRY_TIMEOUT: ProjectPersistenceManager.EXPIRY_TIMEOUT}, "project assets kept timeout"


@@ -21,6 +21,37 @@ module.exports = RequestParser =
compile.options.timeout
default: RequestParser.MAX_TIMEOUT
type: "number"
response.imageName = @_parseAttribute "imageName",
compile.options.imageName,
type: "string"
response.draft = @_parseAttribute "draft",
compile.options.draft,
default: false,
type: "boolean"
response.check = @_parseAttribute "check",
compile.options.check,
type: "string"
# The syncType specifies whether the request contains all
# resources (full) or only those resources to be updated
# in-place (incremental).
response.syncType = @_parseAttribute "syncType",
compile.options.syncType,
validValues: ["full", "incremental"]
type: "string"
# The syncState is an identifier passed in with the request
# which has the property that it changes when any resource is
# added, deleted, moved or renamed.
#
# on syncType full the syncState identifier is passed in and
# stored
#
# on syncType incremental the syncState identifier must match
# the stored value
response.syncState = @_parseAttribute "syncState",
compile.options.syncState,
type: "string"
if response.timeout > RequestParser.MAX_TIMEOUT
response.timeout = RequestParser.MAX_TIMEOUT
@@ -32,7 +63,13 @@ module.exports = RequestParser =
compile.rootResourcePath
default: "main.tex"
type: "string"
response.rootResourcePath = RequestParser._sanitizePath(rootResourcePath)
originalRootResourcePath = rootResourcePath
sanitizedRootResourcePath = RequestParser._sanitizePath(rootResourcePath)
response.rootResourcePath = RequestParser._checkPath(sanitizedRootResourcePath)
for resource in response.resources
if resource.path == originalRootResourcePath
resource.path = sanitizedRootResourcePath
catch error
return callback error
@@ -71,9 +108,15 @@ module.exports = RequestParser =
throw "#{name} attribute should be a #{options.type}"
else
return options.default if options.default?
throw "Default not implemented"
return attribute
_sanitizePath: (path) ->
# See http://php.net/manual/en/function.escapeshellcmd.php
path.replace(/[\#\&\;\`\|\*\?\~\<\>\^\(\)\[\]\{\}\$\\\x0A\xFF\x00]/g, "")
_checkPath: (path) ->
# check that the request does not use a relative path
for dir in path.split('/')
if dir == '..'
throw "relative path in root resource"
return path
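`_sanitizePath` and `_checkPath` above do two independent things: strip shell metacharacters (modelled on PHP's `escapeshellcmd`) and reject any `..` component. A JavaScript sketch of both (the function names here are illustrative, not the module's):

```javascript
// Strip the same shell metacharacter set as the CoffeeScript above.
function sanitizePath(p) {
  return p.replace(/[\#\&\;\`\|\*\?\~\<\>\^\(\)\[\]\{\}\$\\\x0A\xFF\x00]/g, "");
}

// Reject any ".." component so the root resource cannot escape
// the compile directory.
function checkRootPath(p) {
  for (const dir of p.split("/")) {
    if (dir === "..") throw new Error("relative path in root resource");
  }
  return p;
}
```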


@@ -0,0 +1,72 @@
Path = require "path"
fs = require "fs"
logger = require "logger-sharelatex"
settings = require("settings-sharelatex")
Errors = require "./Errors"
SafeReader = require "./SafeReader"
module.exports = ResourceStateManager =
# The sync state is an identifier which must match for an
# incremental update to be allowed.
#
# The initial value is passed in and stored on a full
# compile, along with the list of resources.
#
# Subsequent incremental compiles must come with the same value - if
# not they will be rejected with a 409 Conflict response. The
# previous list of resources is returned.
#
# An incremental compile can only update existing files with new
# content. The sync state identifier must change if any docs or
# files are moved, added, deleted or renamed.
SYNC_STATE_FILE: ".project-sync-state"
SYNC_STATE_MAX_SIZE: 128*1024
saveProjectState: (state, resources, basePath, callback = (error) ->) ->
stateFile = Path.join(basePath, @SYNC_STATE_FILE)
if not state? # remove the file if no state passed in
logger.log state:state, basePath:basePath, "clearing sync state"
fs.unlink stateFile, (err) ->
if err? and err.code isnt 'ENOENT'
return callback(err)
else
return callback()
else
logger.log state:state, basePath:basePath, "writing sync state"
resourceList = (resource.path for resource in resources)
fs.writeFile stateFile, [resourceList..., "stateHash:#{state}"].join("\n"), callback
checkProjectStateMatches: (state, basePath, callback = (error, resources) ->) ->
stateFile = Path.join(basePath, @SYNC_STATE_FILE)
size = @SYNC_STATE_MAX_SIZE
SafeReader.readFile stateFile, size, 'utf8', (err, result, bytesRead) ->
return callback(err) if err?
if bytesRead is size
logger.error file:stateFile, size:size, bytesRead:bytesRead, "project state file truncated"
[resourceList..., oldState] = result?.toString()?.split("\n") or []
newState = "stateHash:#{state}"
logger.log state:state, oldState: oldState, basePath:basePath, stateMatches: (newState is oldState), "checking sync state"
if newState isnt oldState
return callback new Errors.FilesOutOfSyncError("invalid state for incremental update")
else
resources = ({path: path} for path in resourceList)
callback(null, resources)
checkResourceFiles: (resources, allFiles, basePath, callback = (error) ->) ->
# check the paths are all relative to current directory
for file in resources or []
for dir in (file?.path?.split('/') or [])
if dir == '..'
return callback new Error("relative path in resource file list")
# check if any of the input files are not present in list of files
seenFile = {}
for file in allFiles
seenFile[file] = true
missingFiles = (resource.path for resource in resources when not seenFile[resource.path])
if missingFiles?.length > 0
logger.err missingFiles:missingFiles, basePath:basePath, allFiles:allFiles, resources:resources, "missing input files for project"
return callback new Errors.FilesOutOfSyncError("resource files missing in incremental update")
else
callback()
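The sync state file format used above is simply one resource path per line followed by a final `stateHash:<state>` line. A round-trip sketch in JavaScript:

```javascript
// Build and parse the .project-sync-state content: resource paths,
// one per line, then a "stateHash:<state>" line at the end.
function buildStateFile(state, resources) {
  const paths = resources.map((r) => r.path);
  return [...paths, `stateHash:${state}`].join("\n");
}

function parseStateFile(content) {
  const lines = content.split("\n");
  const oldState = lines.pop(); // last line carries the state hash
  return { resources: lines.map((path) => ({ path })), oldState };
}
```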


@@ -4,25 +4,71 @@ fs = require "fs"
async = require "async"
mkdirp = require "mkdirp"
OutputFileFinder = require "./OutputFileFinder"
ResourceStateManager = require "./ResourceStateManager"
Metrics = require "./Metrics"
logger = require "logger-sharelatex"
settings = require("settings-sharelatex")
parallelFileDownloads = settings.parallelFileDownloads or 1
module.exports = ResourceWriter =
syncResourcesToDisk: (project_id, resources, basePath, callback = (error) ->) ->
@_removeExtraneousFiles resources, basePath, (error) =>
syncResourcesToDisk: (request, basePath, callback = (error, resourceList) ->) ->
if request.syncType is "incremental"
logger.log project_id: request.project_id, user_id: request.user_id, "incremental sync"
ResourceStateManager.checkProjectStateMatches request.syncState, basePath, (error, resourceList) ->
return callback(error) if error?
ResourceWriter._removeExtraneousFiles resourceList, basePath, (error, outputFiles, allFiles) ->
return callback(error) if error?
ResourceStateManager.checkResourceFiles resourceList, allFiles, basePath, (error) ->
return callback(error) if error?
ResourceWriter.saveIncrementalResourcesToDisk request.project_id, request.resources, basePath, (error) ->
return callback(error) if error?
callback(null, resourceList)
else
logger.log project_id: request.project_id, user_id: request.user_id, "full sync"
@saveAllResourcesToDisk request.project_id, request.resources, basePath, (error) ->
return callback(error) if error?
ResourceStateManager.saveProjectState request.syncState, request.resources, basePath, (error) ->
return callback(error) if error?
callback(null, request.resources)
saveIncrementalResourcesToDisk: (project_id, resources, basePath, callback = (error) ->) ->
@_createDirectory basePath, (error) =>
return callback(error) if error?
jobs = for resource in resources
do (resource) =>
(callback) => @_writeResourceToDisk(project_id, resource, basePath, callback)
async.series jobs, callback
async.parallelLimit jobs, parallelFileDownloads, callback
_removeExtraneousFiles: (resources, basePath, _callback = (error) ->) ->
saveAllResourcesToDisk: (project_id, resources, basePath, callback = (error) ->) ->
@_createDirectory basePath, (error) =>
return callback(error) if error?
@_removeExtraneousFiles resources, basePath, (error) =>
return callback(error) if error?
jobs = for resource in resources
do (resource) =>
(callback) => @_writeResourceToDisk(project_id, resource, basePath, callback)
async.parallelLimit jobs, parallelFileDownloads, callback
_createDirectory: (basePath, callback = (error) ->) ->
fs.mkdir basePath, (err) ->
if err?
if err.code is 'EEXIST'
return callback()
else
logger.log {err: err, dir:basePath}, "error creating directory"
return callback(err)
else
return callback()
_removeExtraneousFiles: (resources, basePath, _callback = (error, outputFiles, allFiles) ->) ->
timer = new Metrics.Timer("unlink-output-files")
callback = (error) ->
callback = (error, result...) ->
timer.done()
_callback(error)
_callback(error, result...)
OutputFileFinder.findOutputFiles resources, basePath, (error, outputFiles) ->
OutputFileFinder.findOutputFiles resources, basePath, (error, outputFiles, allFiles) ->
return callback(error) if error?
jobs = []
@@ -30,37 +76,55 @@ module.exports = ResourceWriter =
do (file) ->
path = file.path
should_delete = true
if path.match(/^output\./) or path.match(/\.aux$/)
if path.match(/^output\./) or path.match(/\.aux$/) or path.match(/^cache\//) # knitr cache
should_delete = false
if path == "output.pdf" or path == "output.dvi" or path == "output.log"
if path.match(/^output-.*/) # Tikz cached figures
should_delete = false
if path == "output.pdf" or path == "output.dvi" or path == "output.log" or path == "output.xdv"
should_delete = true
if path == "output.tex" # created by TikzManager if present in output files
should_delete = true
if should_delete
jobs.push (callback) -> ResourceWriter._deleteFileIfNotDirectory Path.join(basePath, path), callback
async.series jobs, callback
async.series jobs, (error) ->
return callback(error) if error?
callback(null, outputFiles, allFiles)
_deleteFileIfNotDirectory: (path, callback = (error) ->) ->
fs.stat path, (error, stat) ->
return callback(error) if error?
if stat.isFile()
fs.unlink path, callback
if error? and error.code is 'ENOENT'
return callback()
else if error?
logger.err {err: error, path: path}, "error stating file in deleteFileIfNotDirectory"
return callback(error)
else if stat.isFile()
fs.unlink path, (error) ->
if error?
logger.err {err: error, path: path}, "error removing file in deleteFileIfNotDirectory"
callback(error)
else
callback()
else
callback()
_writeResourceToDisk: (project_id, resource, basePath, callback = (error) ->) ->
path = Path.normalize(Path.join(basePath, resource.path))
if (path.slice(0, basePath.length) != basePath)
return callback new Error("resource path is outside root directory")
mkdirp Path.dirname(path), (error) ->
ResourceWriter.checkPath basePath, resource.path, (error, path) ->
return callback(error) if error?
# TODO: Don't overwrite file if it hasn't been modified
if resource.url?
UrlCache.downloadUrlToFile project_id, resource.url, path, resource.modified, (err)->
if err?
logger.err err:err, project_id:project_id, path:path, resource_url:resource.url, modified:resource.modified, "error downloading file for resources"
callback() #try and continue compiling even if http resource can not be downloaded at this time
else
fs.writeFile path, resource.content, callback
mkdirp Path.dirname(path), (error) ->
return callback(error) if error?
# TODO: Don't overwrite file if it hasn't been modified
if resource.url?
UrlCache.downloadUrlToFile project_id, resource.url, path, resource.modified, (err)->
if err?
logger.err err:err, project_id:project_id, path:path, resource_url:resource.url, modified:resource.modified, "error downloading file for resources"
callback() #try and continue compiling even if http resource can not be downloaded at this time
else
fs.writeFile path, resource.content, callback
checkPath: (basePath, resourcePath, callback) ->
path = Path.normalize(Path.join(basePath, resourcePath))
if (path.slice(0, basePath.length + 1) != basePath + "/")
return callback new Error("resource path is outside root directory")
else
return callback(null, path)


@@ -0,0 +1,25 @@
fs = require "fs"
logger = require "logger-sharelatex"
module.exports = SafeReader =
# safely read up to size bytes from a file and return result as a
# string
readFile: (file, size, encoding, callback = (error, result) ->) ->
fs.open file, 'r', (err, fd) ->
return callback() if err? and err.code is 'ENOENT'
return callback(err) if err?
# safely return always closing the file
callbackWithClose = (err, result...) ->
fs.close fd, (err1) ->
return callback(err) if err?
return callback(err1) if err1?
callback(null, result...)
buff = new Buffer(size)
buff.fill(0) # zero the buffer so no uninitialised memory can leak
fs.read fd, buff, 0, buff.length, 0, (err, bytesRead, buffer) ->
return callbackWithClose(err) if err?
result = buffer.toString(encoding, 0, bytesRead)
callbackWithClose(null, result, bytesRead)


@@ -29,10 +29,10 @@ module.exports = ForbidSymlinks = (staticFn, root, options) ->
# check that the requested path is not a symlink
fs.realpath requestedFsPath, (err, realFsPath)->
if err?
logger.warn err:err, requestedFsPath:requestedFsPath, realFsPath:realFsPath, path: req.params[0], project_id: req.params.project_id, "error checking file access"
if err.code == 'ENOENT'
return res.sendStatus(404)
else
logger.error err:err, requestedFsPath:requestedFsPath, realFsPath:realFsPath, path: req.params[0], project_id: req.params.project_id, "error checking file access"
return res.sendStatus(500)
else if requestedFsPath != realFsPath
logger.warn requestedFsPath:requestedFsPath, realFsPath:realFsPath, path: req.params[0], project_id: req.params.project_id, "trying to access a different file (symlink), aborting"


@@ -0,0 +1,35 @@
fs = require "fs"
Path = require "path"
ResourceWriter = require "./ResourceWriter"
SafeReader = require "./SafeReader"
logger = require "logger-sharelatex"
# for \tikzexternalize to work the main file needs to match the
# jobname. Since we set the -jobname to output, we have to create a
# copy of the main file as 'output.tex'.
module.exports = TikzManager =
checkMainFile: (compileDir, mainFile, resources, callback = (error, usesTikzExternalize) ->) ->
# if there's already an output.tex file, we don't want to touch it
for resource in resources
if resource.path is "output.tex"
logger.log compileDir: compileDir, mainFile: mainFile, "output.tex already in resources"
return callback(null, false)
# if there's no output.tex, see if we are using tikz/pgf in the main file
ResourceWriter.checkPath compileDir, mainFile, (error, path) ->
return callback(error) if error?
SafeReader.readFile path, 65536, "utf8", (error, content) ->
return callback(error) if error?
usesTikzExternalize = content?.indexOf("\\tikzexternalize") >= 0
logger.log compileDir: compileDir, mainFile: mainFile, usesTikzExternalize:usesTikzExternalize, "checked for tikzexternalize"
callback null, usesTikzExternalize
injectOutputFile: (compileDir, mainFile, callback = (error) ->) ->
ResourceWriter.checkPath compileDir, mainFile, (error, path) ->
return callback(error) if error?
fs.readFile path, "utf8", (error, content) ->
return callback(error) if error?
logger.log compileDir: compileDir, mainFile: mainFile, "copied file to output.tex for tikz"
# use wx flag to ensure that output file does not already exist
fs.writeFile Path.join(compileDir, "output.tex"), content, {flag:'wx'}, callback


@@ -87,7 +87,11 @@ module.exports = UrlCache =
callback null
_deleteUrlCacheFromDisk: (project_id, url, callback = (error) ->) ->
fs.unlink UrlCache._cacheFilePathForUrl(project_id, url), callback
fs.unlink UrlCache._cacheFilePathForUrl(project_id, url), (error) ->
if error? and error.code != 'ENOENT' # no error if the file isn't present
return callback(error)
else
return callback()
_findUrlDetails: (project_id, url, callback = (error, urlDetails) ->) ->
db.UrlCache.find(where: { url: url, project_id: project_id })


@@ -16,24 +16,29 @@ module.exports =
clsiCacheDir: Path.resolve(__dirname + "/../cache")
synctexBaseDir: (project_id) -> Path.join(@compilesDir, project_id)
# clsi:
# commandRunner: "docker-runner-sharelatex"
# docker:
# image: "quay.io/sharelatex/texlive-full"
# env:
# PATH: "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/2013/bin/x86_64-linux/"
# HOME: "/tmp"
# modem:
# socketPath: false
# user: "tex"
internal:
clsi:
port: 3013
host: "localhost"
host: process.env["LISTEN_ADDRESS"] or "localhost"
apis:
clsi:
url: "http://localhost:3013"
smokeTest: false
project_cache_length_ms: 1000 * 60 * 60 * 24
parallelFileDownloads: 1
if process.env["COMMAND_RUNNER"]
module.exports.clsi =
commandRunner: process.env["COMMAND_RUNNER"]
docker:
image: process.env["TEXLIVE_IMAGE"] or "quay.io/sharelatex/texlive-full:2017.1"
env:
HOME: "/tmp"
socketPath: "/var/run/docker.sock"
user: process.env["TEXLIVE_IMAGE_USER"] or "tex"
expireProjectAfterIdleMs: 24 * 60 * 60 * 1000
checkProjectsIntervalMs: 10 * 60 * 1000
module.exports.path.sandboxedCompilesHostDir = process.env["COMPILES_HOST_DIR"]
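The settings above follow one pattern throughout: read an environment variable, fall back to a safe default, and only attach optional sections (like the docker command runner) when the controlling variable is set. A reduced sketch of that pattern — values are illustrative, not the full settings file, and taking `env` as a parameter (instead of reading `process.env` directly as the real file does) just makes the sketch testable:

```javascript
// Build settings from the environment with defaults, as the file above does.
function buildSettings(env) {
  const settings = {
    internal: {
      clsi: {
        port: 3013,
        host: env.LISTEN_ADDRESS || "localhost",
      },
    },
  };
  // The docker command runner is only configured when requested.
  if (env.COMMAND_RUNNER) {
    settings.clsi = {
      commandRunner: env.COMMAND_RUNNER,
      docker: {
        image: env.TEXLIVE_IMAGE || "quay.io/sharelatex/texlive-full:2017.1",
        user: env.TEXLIVE_IMAGE_USER || "tex",
      },
    };
  }
  return settings;
}
```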


@@ -6,26 +6,32 @@
"type": "git",
"url": "https://github.com/sharelatex/clsi-sharelatex.git"
},
"scripts": {
"compile:app": "coffee -o app/js -c app/coffee && coffee -c app.coffee",
"start": "npm run compile:app && node app.js"
},
"author": "James Allen <james@sharelatex.com>",
"dependencies": {
"async": "0.2.9",
"body-parser": "^1.2.0",
"express": "^4.2.0",
"fs-extra": "^0.16.3",
"grunt-mkdir": "^1.0.0",
"heapdump": "^0.3.5",
"lockfile": "^1.0.3",
"logger-sharelatex": "git+https://github.com/sharelatex/logger-sharelatex.git#v1.5.4",
"lynx": "0.0.11",
"metrics-sharelatex": "git+https://github.com/sharelatex/metrics-sharelatex.git#v1.5.0",
"mkdirp": "0.3.5",
"mysql": "2.6.2",
"request": "~2.21.0",
"logger-sharelatex": "git+https://github.com/sharelatex/logger-sharelatex.git#v1.0.0",
"settings-sharelatex": "git+https://github.com/sharelatex/settings-sharelatex.git#v1.0.0",
"metrics-sharelatex": "git+https://github.com/sharelatex/metrics-sharelatex.git#v1.3.0",
"request": "^2.21.0",
"sequelize": "^2.1.3",
"wrench": "~1.5.4",
"settings-sharelatex": "git+https://github.com/sharelatex/settings-sharelatex.git#v1.0.0",
"smoke-test-sharelatex": "git+https://github.com/sharelatex/smoke-test-sharelatex.git#v0.2.0",
"sqlite3": "~2.2.0",
"express": "^4.2.0",
"body-parser": "^1.2.0",
"fs-extra": "^0.16.3",
"sqlite3": "~3.1.8",
"underscore": "^1.8.2",
"v8-profiler": "^5.2.4",
"heapdump": "^0.3.5"
"wrench": "~1.5.4"
},
"devDependencies": {
"mocha": "1.10.0",


@@ -12,18 +12,35 @@ catch e
convertToPng = (pdfPath, pngPath, callback = (error) ->) ->
convert = ChildProcess.exec "convert #{fixturePath(pdfPath)} #{fixturePath(pngPath)}"
stdout = ""
convert.stdout.on "data", (chunk) -> console.log "STDOUT", chunk.toString()
convert.stderr.on "data", (chunk) -> console.log "STDERR", chunk.toString()
convert.on "exit", () ->
callback()
compare = (originalPath, generatedPath, callback = (error, same) ->) ->
proc = ChildProcess.exec "compare -metric mae #{fixturePath(originalPath)} #{fixturePath(generatedPath)} #{fixturePath("tmp/diff.png")}"
diff_file = "#{fixturePath(generatedPath)}-diff.png"
proc = ChildProcess.exec "compare -metric mae #{fixturePath(originalPath)} #{fixturePath(generatedPath)} #{diff_file}"
stderr = ""
proc.stderr.on "data", (chunk) -> stderr += chunk
proc.on "exit", () ->
if stderr.trim() == "0 (0)"
fs.unlink diff_file # remove output diff if test matches expected image
callback null, true
else
console.log stderr
console.log "compare result", stderr
callback null, false
checkPdfInfo = (pdfPath, callback = (error, output) ->) ->
proc = ChildProcess.exec "pdfinfo #{fixturePath(pdfPath)}"
stdout = ""
proc.stdout.on "data", (chunk) -> stdout += chunk
proc.stderr.on "data", (chunk) -> console.log "STDERR", chunk.toString()
proc.on "exit", () ->
if stdout.match(/Optimized:\s+yes/)
callback null, true
else
console.log "pdfinfo result", stdout
callback null, false
compareMultiplePages = (project_id, callback = (error) ->) ->
@@ -39,24 +56,30 @@ compareMultiplePages = (project_id, callback = (error) ->) ->
compareNext page_no + 1, callback
compareNext 0, callback
comparePdf = (project_id, example_dir, callback = (error) ->) ->
convertToPng "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png", (error) =>
throw error if error?
convertToPng "examples/#{example_dir}/output.pdf", "tmp/#{project_id}-source.png", (error) =>
throw error if error?
fs.stat fixturePath("tmp/#{project_id}-source-0.png"), (error, stat) =>
if error?
compare "tmp/#{project_id}-source.png", "tmp/#{project_id}-generated.png", (error, same) =>
throw error if error?
same.should.equal true
callback()
else
compareMultiplePages project_id, (error) ->
throw error if error?
callback()
downloadAndComparePdf = (project_id, example_dir, url, callback = (error) ->) ->
writeStream = fs.createWriteStream(fixturePath("tmp/#{project_id}.pdf"))
request.get(url).pipe(writeStream)
writeStream.on "close", () =>
convertToPng "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png", (error) =>
checkPdfInfo "tmp/#{project_id}.pdf", (error, optimised) =>
throw error if error?
convertToPng "examples/#{example_dir}/output.pdf", "tmp/#{project_id}-source.png", (error) =>
throw error if error?
fs.stat fixturePath("tmp/#{project_id}-source-0.png"), (error, stat) =>
if error?
compare "tmp/#{project_id}-source.png", "tmp/#{project_id}-generated.png", (error, same) =>
throw error if error?
same.should.equal true
callback()
else
compareMultiplePages project_id, (error) ->
throw error if error?
callback()
optimised.should.equal true
comparePdf project_id, example_dir, callback
Client.runServer(4242, fixturePath("examples"))
@@ -68,15 +91,19 @@ describe "Example Documents", ->
do (example_dir) ->
describe example_dir, ->
before ->
@project_id = Client.randomId()
@project_id = Client.randomId() + "_" + example_dir
it "should generate the correct pdf", (done) ->
Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) =>
if error || body?.compile?.status is "failure"
console.log "DEBUG: error", error, "body", JSON.stringify(body)
pdf = Client.getOutputFile body, "pdf"
downloadAndComparePdf(@project_id, example_dir, pdf.url, done)
it "should generate the correct pdf on the second run as well", (done) ->
Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) =>
if error || body?.compile?.status is "failure"
console.log "DEBUG: error", error, "body", JSON.stringify(body)
pdf = Client.getOutputFile body, "pdf"
downloadAndComparePdf(@project_id, example_dir, pdf.url, done)
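`checkPdfInfo` above decides whether the PDF was optimised (linearised) by scanning `pdfinfo` output for an `Optimized: yes` line. The match itself is a one-liner; as a sketch:

```javascript
// True when a pdfinfo report says the PDF is optimised (linearised).
// pdfinfo prints aligned columns, e.g. "Optimized:      yes".
function isOptimised(pdfinfoOutput) {
  return /Optimized:\s+yes/.test(pdfinfoOutput);
}
```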


@@ -6,13 +6,14 @@ describe "Timed out compile", ->
before (done) ->
@request =
options:
timeout: 0.01 #seconds
timeout: 1 #seconds
resources: [
path: "main.tex"
content: '''
\\documentclass{article}
\\begin{document}
Hello world
\\input{|"sleep 10"}
\\end{document}
'''
]


@@ -29,6 +29,8 @@ describe "Syncing", ->
elements: 0
mathInline: 6
mathDisplay: 0
errors: 0
messages: ""
}
)
done()


@@ -30,7 +30,10 @@ module.exports = Client =
express = require("express")
app = express()
app.use express.static(directory)
app.listen(port, host)
app.listen(port, host).on "error", (error) ->
console.error "error starting server:", error.message
process.exit(1)
syncFromCode: (project_id, file, line, column, callback = (error, pdfPositions) ->) ->
request.get {


@@ -0,0 +1,12 @@
\documentclass{article}
\usepackage{fontawesome}
\begin{document}
Cloud \faCloud
Cog \faCog
Database \faDatabase
Leaf \faLeaf
\end{document}


@@ -0,0 +1,16 @@
\documentclass{article}
\usepackage{fontspec}
\defaultfontfeatures{Extension = .otf} % this is needed because
% fontawesome package loads by
% font name only
\usepackage{fontawesome}
\begin{document}
Cloud \faCloud
Cog \faCog
Database \faDatabase
Leaf \faLeaf
\end{document}


@@ -0,0 +1,3 @@
{
"compiler": "xelatex"
}


@@ -0,0 +1,14 @@
\documentclass{article}
\usepackage[utf8x]{inputenc}
\usepackage[hebrew,english]{babel}
\begin{document}
\selectlanguage{hebrew}
כדי לכתוב משהו באנגלית חייבים להשתמש במקרו הבא וכאן
ממשיכים לכתוב בעברית. טקסט נוסחאות תמיד יהיה בכיוון שמאל-לימין
\selectlanguage{english}
This is a test.
\end{document}

Binary file not shown.


@@ -0,0 +1,35 @@
\documentclass{article}
\usepackage[utf8]{inputenc}
\usepackage[spanish]{babel}
\begin{document}
\tableofcontents
\vspace{2cm} %Add a 2cm space
\begin{abstract}
Este es un breve resumen del contenido del
documento escrito en español.
\end{abstract}
\section{Sección Introductoria}
Esta es la primera sección, podemos agregar
algunos elementos adicionales y todo será
escrito correctamente. Más aún, si una palabra
es demasiado larga y tiene que ser truncada,
babel tratará de truncarla correctamente
dependiendo del idioma.
\section{Sección con teoremas}
Esta sección es para ver que pasa con los comandos
que definen texto
%% chunk options: cache this chunk
%% begin.rcode my-cache, cache=TRUE
% set.seed(123)
% x = runif(10)
% sd(x) # standard deviation
%% end.rcode
\end{document}


@@ -1,4 +1,4 @@
\documentclass{article}
\documentclass[a4paper]{article}
\usepackage{graphicx}


@@ -0,0 +1,66 @@
\RequirePackage{luatex85}
\documentclass[tikz]{standalone}
\usepackage[compat=1.1.0]{tikz-feynman}
\begin{document}
\feynmandiagram [horizontal=a to b] {
i1 -- [fermion] a -- [fermion] i2,
a -- [photon] b,
f1 -- [fermion] b -- [fermion] f2,
};
\feynmandiagram [horizontal=a to b] {
i1 [particle=\(e^{-}\)] -- [fermion] a -- [fermion] i2 [particle=\(e^{+}\)],
a -- [photon, edge label=\(\gamma\), momentum'=\(k\)] b,
f1 [particle=\(\mu^{+}\)] -- [fermion] b -- [fermion] f2 [particle=\(\mu^{-}\)],
};
\feynmandiagram [large, vertical=e to f] {
a -- [fermion] b -- [photon, momentum=\(k\)] c -- [fermion] d,
b -- [fermion, momentum'=\(p_{1}\)] e -- [fermion, momentum'=\(p_{2}\)] c,
e -- [gluon] f,
h -- [fermion] f -- [fermion] i,
};
\begin{tikzpicture}
\begin{feynman}
\vertex (a1) {\(\overline b\)};
\vertex[right=1cm of a1] (a2);
\vertex[right=1cm of a2] (a3);
\vertex[right=1cm of a3] (a4) {\(b\)};
\vertex[right=1cm of a4] (a5);
\vertex[right=2cm of a5] (a6) {\(u\)};
\vertex[below=2em of a1] (b1) {\(d\)};
\vertex[right=1cm of b1] (b2);
\vertex[right=1cm of b2] (b3);
\vertex[right=1cm of b3] (b4) {\(\overline d\)};
\vertex[below=2em of a6] (b5) {\(\overline d\)};
\vertex[above=of a6] (c1) {\(\overline u\)};
\vertex[above=2em of c1] (c3) {\(d\)};
\vertex at ($(c1)!0.5!(c3) - (1cm, 0)$) (c2);
\diagram* {
{[edges=fermion]
(b1) -- (b2) -- (a2) -- (a1),
(b5) -- (b4) -- (b3) -- (a3) -- (a4) -- (a5) -- (a6),
},
(a2) -- [boson, edge label=\(W\)] (a3),
(b2) -- [boson, edge label'=\(W\)] (b3),
(c1) -- [fermion, out=180, in=-45] (c2) -- [fermion, out=45, in=180] (c3),
(a5) -- [boson, bend left, edge label=\(W^{-}\)] (c2),
};
\draw [decoration={brace}, decorate] (b1.south west) -- (a1.north west)
node [pos=0.5, left] {\(B^{0}\)};
\draw [decoration={brace}, decorate] (c3.north east) -- (c1.south east)
node [pos=0.5, right] {\(\pi^{-}\)};
\draw [decoration={brace}, decorate] (a6.north east) -- (b5.south east)
node [pos=0.5, right] {\(\pi^{+}\)};
\end{feynman}
\end{tikzpicture}
\end{document}


@@ -0,0 +1,3 @@
{
"compiler": "lualatex"
}


@@ -0,0 +1,23 @@
#!/bin/bash -x
export SHARELATEX_CONFIG=`pwd`/test/acceptance/scripts/settings.test.coffee
echo ">> Starting server..."
grunt --no-color >server.log 2>&1 &
echo ">> Server started"
sleep 5
echo ">> Running acceptance tests..."
grunt --no-color mochaTest:acceptance
_test_exit_code=$?
echo ">> Killing server"
kill %1
echo ">> Done"
exit $_test_exit_code


@@ -0,0 +1,47 @@
Path = require "path"
module.exports =
# Options are passed to Sequelize.
# See http://sequelizejs.com/documentation#usage-options for details
mysql:
clsi:
database: "clsi"
username: "clsi"
password: null
dialect: "sqlite"
storage: Path.resolve("db.sqlite")
path:
compilesDir: Path.resolve(__dirname + "/../../../compiles")
clsiCacheDir: Path.resolve(__dirname + "/../../../cache")
#synctexBaseDir: (project_id) -> Path.join(@compilesDir, project_id)
synctexBaseDir: () -> "/compile"
sandboxedCompilesHostDir: process.env['SANDBOXED_COMPILES_HOST_DIR']
clsi:
#strace: true
#archive_logs: true
commandRunner: "docker-runner-sharelatex"
latexmkCommandPrefix: ["/usr/bin/time", "-v"] # on Linux
docker:
image: process.env.TEXLIVE_IMAGE || "texlive-full:2017.1-opt"
env:
PATH: "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/2017/bin/x86_64-linux/"
HOME: "/tmp"
modem:
socketPath: false
user: process.env.SIBLING_CONTAINER_USER || "111"
internal:
clsi:
port: 3013
load_port: 3044
host: "localhost"
apis:
clsi:
url: "http://localhost:3013"
smokeTest: false
project_cache_length_ms: 1000 * 60 * 60 * 24
parallelFileDownloads: 1


@@ -6,19 +6,48 @@ Settings = require "settings-sharelatex"
buildUrl = (path) -> "http://#{Settings.internal.clsi.host}:#{Settings.internal.clsi.port}/#{path}"
url = buildUrl("project/smoketest-#{process.pid}/compile")
describe "Running a compile", ->
before (done) ->
request.post {
url: buildUrl("project/smoketest/compile")
url: url
json:
compile:
resources: [
path: "main.tex"
content: """
\\documentclass{article}
\\begin{document}
Hello world
\\end{document}
% Membrane-like surface
% Author: Yotam Avital
\\documentclass{article}
\\usepackage{tikz}
\\usetikzlibrary{calc,fadings,decorations.pathreplacing}
\\begin{document}
\\begin{tikzpicture}
\\def\\nuPi{3.1459265}
\\foreach \\i in {5,4,...,2}{% This one doesn't matter
\\foreach \\j in {3,2,...,0}{% This will create a membrane
% with the front lipids visible
% top layer
\\pgfmathsetmacro{\\dx}{rand*0.1}% A random variance in the x coordinate
\\pgfmathsetmacro{\\dy}{rand*0.1}% A random variance in the y coordinate,
% gives a height fill to the lipid
\\pgfmathsetmacro{\\rot}{rand*0.1}% A random variance in the
% molecule orientation
\\shade[ball color=red] ({\\i+\\dx+\\rot},{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)}) circle(0.45);
\\shade[ball color=gray] (\\i+\\dx,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-0.9}) circle(0.45);
\\shade[ball color=gray] (\\i+\\dx-\\rot,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-1.8}) circle(0.45);
% bottom layer
\\pgfmathsetmacro{\\dx}{rand*0.1}
\\pgfmathsetmacro{\\dy}{rand*0.1}
\\pgfmathsetmacro{\\rot}{rand*0.1}
\\shade[ball color=gray] (\\i+\\dx+\\rot,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-2.8}) circle(0.45);
\\shade[ball color=gray] (\\i+\\dx,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-3.7}) circle(0.45);
\\shade[ball color=red] (\\i+\\dx-\\rot,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-4.6}) circle(0.45);
}
}
\\end{tikzpicture}
\\end{document}
"""
]
}, (@error, @response, @body) =>


@@ -49,7 +49,7 @@ describe "CompileController", ->
describe "successfully", ->
beforeEach ->
@CompileManager.doCompile = sinon.stub().callsArgWith(1, null, @output_files)
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, null, @output_files)
@CompileController.compile @req, @res
it "should parse the request", ->
@@ -58,7 +58,7 @@ describe "CompileController", ->
.should.equal true
it "should run the compile for the specified project", ->
@CompileManager.doCompile
@CompileManager.doCompileWithLock
.calledWith(@request_with_project_id)
.should.equal true
@@ -75,7 +75,8 @@ describe "CompileController", ->
status: "success"
error: null
outputFiles: @output_files.map (file) =>
url: "#{@Settings.apis.clsi.url}/project/#{@project_id}/output/#{file.path}"
url: "#{@Settings.apis.clsi.url}/project/#{@project_id}/build/#{file.build}/output/#{file.path}"
path: file.path
type: file.type
build: file.build
)
@@ -83,7 +84,7 @@ describe "CompileController", ->
describe "with an error", ->
beforeEach ->
@CompileManager.doCompile = sinon.stub().callsArgWith(1, new Error(@message = "error message"), null)
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, new Error(@message = "error message"), null)
@CompileController.compile @req, @res
it "should return the JSON response with the error", ->
@@ -101,7 +102,7 @@ describe "CompileController", ->
beforeEach ->
@error = new Error(@message = "container timed out")
@error.timedout = true
@CompileManager.doCompile = sinon.stub().callsArgWith(1, @error, null)
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, @error, null)
@CompileController.compile @req, @res
it "should return the JSON response with the timeout status", ->
@@ -117,7 +118,7 @@ describe "CompileController", ->
describe "when the request returns no output files", ->
beforeEach ->
@CompileManager.doCompile = sinon.stub().callsArgWith(1, null, [])
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, null, [])
@CompileController.compile @req, @res
it "should return the JSON response with the failure status", ->
@@ -145,12 +146,12 @@ describe "CompileController", ->
column: @column.toString()
@res.send = sinon.stub()
@CompileManager.syncFromCode = sinon.stub().callsArgWith(4, null, @pdfPositions = ["mock-positions"])
@CompileManager.syncFromCode = sinon.stub().callsArgWith(5, null, @pdfPositions = ["mock-positions"])
@CompileController.syncFromCode @req, @res, @next
it "should find the corresponding location in the PDF", ->
@CompileManager.syncFromCode
.calledWith(@project_id, @file, @line, @column)
.calledWith(@project_id, undefined, @file, @line, @column)
.should.equal true
it "should return the positions", ->
@@ -174,12 +175,12 @@ describe "CompileController", ->
v: @v.toString()
@res.send = sinon.stub()
@CompileManager.syncFromPdf = sinon.stub().callsArgWith(4, null, @codePositions = ["mock-positions"])
@CompileManager.syncFromPdf = sinon.stub().callsArgWith(5, null, @codePositions = ["mock-positions"])
@CompileController.syncFromPdf @req, @res, @next
it "should find the corresponding location in the code", ->
@CompileManager.syncFromPdf
.calledWith(@project_id, @page, @h, @v)
.calledWith(@project_id, undefined, @page, @h, @v)
.should.equal true
it "should return the positions", ->
@@ -197,14 +198,15 @@ describe "CompileController", ->
project_id: @project_id
@req.query =
file: @file
image: @image = "example.com/image"
@res.send = sinon.stub()
@CompileManager.wordcount = sinon.stub().callsArgWith(2, null, @texcount = ["mock-texcount"])
@CompileManager.wordcount = sinon.stub().callsArgWith(4, null, @texcount = ["mock-texcount"])
@CompileController.wordcount @req, @res, @next
it "should return the word count of a file", ->
@CompileManager.wordcount
.calledWith(@project_id, @file)
.calledWith(@project_id, undefined, @file, @image)
.should.equal true
it "should return the texcount info", ->


@@ -14,12 +14,66 @@ describe "CompileManager", ->
"./OutputFileFinder": @OutputFileFinder = {}
"./OutputCacheManager": @OutputCacheManager = {}
"settings-sharelatex": @Settings = { path: compilesDir: "/compiles/dir" }
"logger-sharelatex": @logger = { log: sinon.stub() }
"logger-sharelatex": @logger = { log: sinon.stub() , info:->}
"child_process": @child_process = {}
"./CommandRunner": @CommandRunner = {}
"./DraftModeManager": @DraftModeManager = {}
"./TikzManager": @TikzManager = {}
"./LockManager": @LockManager = {}
"fs": @fs = {}
"fs-extra": @fse = { ensureDir: sinon.stub().callsArg(1) }
@callback = sinon.stub()
describe "doCompileWithLock", ->
beforeEach ->
@request =
resources: @resources = "mock-resources"
project_id: @project_id = "project-id-123"
user_id: @user_id = "1234"
@output_files = ["foo", "bar"]
@Settings.compileDir = "compiles"
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@CompileManager.doCompile = sinon.stub().callsArgWith(1, null, @output_files)
@LockManager.runWithLock = (lockFile, runner, callback) ->
runner (err, result...) ->
callback(err, result...)
describe "when the project is not locked", ->
beforeEach ->
@CompileManager.doCompileWithLock @request, @callback
it "should ensure that the compile directory exists", ->
@fse.ensureDir.calledWith(@compileDir)
.should.equal true
it "should call doCompile with the request", ->
@CompileManager.doCompile
.calledWith(@request)
.should.equal true
it "should call the callback with the output files", ->
@callback.calledWithExactly(null, @output_files)
.should.equal true
describe "when the project is locked", ->
beforeEach ->
@error = new Error("locked")
@LockManager.runWithLock = (lockFile, runner, callback) =>
callback(@error)
@CompileManager.doCompileWithLock @request, @callback
it "should ensure that the compile directory exists", ->
@fse.ensureDir.calledWith(@compileDir)
.should.equal true
it "should not call doCompile with the request", ->
@CompileManager.doCompile
.called.should.equal false
it "should call the callback with the error", ->
@callback.calledWithExactly(@error)
.should.equal true
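These tests stub `LockManager.runWithLock(lockFile, runner, callback)`: the runner only executes when the lock is free, and its results (or the lock error) are forwarded to the callback. A toy in-memory version of that contract — the real LockManager locks on disk via the `lockfile` package, so this is only an illustration:

```javascript
// In-memory sketch of the runWithLock contract the tests above stub.
const heldLocks = new Set();

function runWithLock(lockName, runner, callback) {
  if (heldLocks.has(lockName)) {
    return callback(new Error("locked")); // another compile holds the lock
  }
  heldLocks.add(lockName);
  runner((err, ...results) => {
    heldLocks.delete(lockName); // release whether the runner failed or not
    callback(err, ...results);
  });
}
```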
describe "doCompile", ->
beforeEach ->
@output_files = [{
@@ -42,53 +96,112 @@ describe "CompileManager", ->
resources: @resources = "mock-resources"
rootResourcePath: @rootResourcePath = "main.tex"
project_id: @project_id = "project-id-123"
user_id: @user_id = "1234"
compiler: @compiler = "pdflatex"
timeout: @timeout = 42000
imageName: @image = "example.com/image"
@env = {}
@Settings.compileDir = "compiles"
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}"
@ResourceWriter.syncResourcesToDisk = sinon.stub().callsArg(3)
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@ResourceWriter.syncResourcesToDisk = sinon.stub().callsArgWith(2, null, @resources)
@LatexRunner.runLatex = sinon.stub().callsArg(2)
@OutputFileFinder.findOutputFiles = sinon.stub().callsArgWith(2, null, @output_files)
@OutputCacheManager.saveOutputFiles = sinon.stub().callsArgWith(2, null, @build_files)
@CompileManager.doCompile @request, @callback
@DraftModeManager.injectDraftMode = sinon.stub().callsArg(1)
@TikzManager.checkMainFile = sinon.stub().callsArgWith(3, null, false)
describe "normally", ->
beforeEach ->
@CompileManager.doCompile @request, @callback
it "should write the resources to disk", ->
@ResourceWriter.syncResourcesToDisk
.calledWith(@project_id, @resources, @compileDir)
.should.equal true
it "should write the resources to disk", ->
@ResourceWriter.syncResourcesToDisk
.calledWith(@request, @compileDir)
.should.equal true
it "should run LaTeX", ->
@LatexRunner.runLatex
.calledWith(@project_id, {
directory: @compileDir
mainFile: @rootResourcePath
compiler: @compiler
timeout: @timeout
})
.should.equal true
it "should run LaTeX", ->
@LatexRunner.runLatex
.calledWith("#{@project_id}-#{@user_id}", {
directory: @compileDir
mainFile: @rootResourcePath
compiler: @compiler
timeout: @timeout
image: @image
environment: @env
})
.should.equal true
it "should find the output files", ->
@OutputFileFinder.findOutputFiles
.calledWith(@resources, @compileDir)
.should.equal true
it "should find the output files", ->
@OutputFileFinder.findOutputFiles
.calledWith(@resources, @compileDir)
.should.equal true
it "should return the output files", ->
@callback.calledWith(null, @build_files).should.equal true
it "should return the output files", ->
@callback.calledWith(null, @build_files).should.equal true
it "should not inject draft mode by default", ->
@DraftModeManager.injectDraftMode.called.should.equal false
describe "with draft mode", ->
beforeEach ->
@request.draft = true
@CompileManager.doCompile @request, @callback
it "should inject the draft mode header", ->
@DraftModeManager.injectDraftMode
.calledWith(@compileDir + "/" + @rootResourcePath)
.should.equal true
describe "with a check option", ->
beforeEach ->
@request.check = "error"
@CompileManager.doCompile @request, @callback
it "should run chktex", ->
@LatexRunner.runLatex
.calledWith("#{@project_id}-#{@user_id}", {
directory: @compileDir
mainFile: @rootResourcePath
compiler: @compiler
timeout: @timeout
image: @image
environment: {'CHKTEX_OPTIONS': '-nall -e9 -e10 -w15 -w16', 'CHKTEX_EXIT_ON_ERROR':1, 'CHKTEX_ULIMIT_OPTIONS': '-t 5 -v 64000'}
})
.should.equal true
describe "with a knitr file and check options", ->
beforeEach ->
@request.rootResourcePath = "main.Rtex"
@request.check = "error"
@CompileManager.doCompile @request, @callback
it "should not run chktex", ->
@LatexRunner.runLatex
.calledWith("#{@project_id}-#{@user_id}", {
directory: @compileDir
mainFile: "main.Rtex"
compiler: @compiler
timeout: @timeout
image: @image
environment: @env
})
.should.equal true
describe "clearProject", ->
describe "succesfully", ->
beforeEach ->
@Settings.compileDir = "compiles"
@fs.lstat = sinon.stub().callsArgWith(1, null,{isDirectory: ()->true})
@proc = new EventEmitter()
@proc.stdout = new EventEmitter()
@proc.stderr = new EventEmitter()
@child_process.spawn = sinon.stub().returns(@proc)
@CompileManager.clearProject @project_id, @callback
@CompileManager.clearProject @project_id, @user_id, @callback
@proc.emit "close", 0
it "should remove the project directory", ->
@child_process.spawn
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}"])
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"])
.should.equal true
it "should call the callback", ->
@@ -97,17 +210,18 @@ describe "CompileManager", ->
describe "with a non-success status code", ->
beforeEach ->
@Settings.compileDir = "compiles"
@fs.lstat = sinon.stub().callsArgWith(1, null,{isDirectory: ()->true})
@proc = new EventEmitter()
@proc.stdout = new EventEmitter()
@proc.stderr = new EventEmitter()
@child_process.spawn = sinon.stub().returns(@proc)
@CompileManager.clearProject @project_id, @callback
@CompileManager.clearProject @project_id, @user_id, @callback
@proc.stderr.emit "data", @error = "oops"
@proc.emit "close", 1
it "should remove the project directory", ->
@child_process.spawn
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}"])
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"])
.should.equal true
it "should call the callback with an error from the stderr", ->
@@ -115,7 +229,7 @@ describe "CompileManager", ->
.calledWith(new Error())
.should.equal true
@callback.args[0][0].message.should.equal "rm -r #{@Settings.path.compilesDir}/#{@project_id} failed: #{@error}"
@callback.args[0][0].message.should.equal "rm -r #{@Settings.path.compilesDir}/#{@project_id}-#{@user_id} failed: #{@error}"
describe "syncing", ->
beforeEach ->
@@ -128,17 +242,18 @@ describe "CompileManager", ->
@column = 3
@file_name = "main.tex"
@child_process.execFile = sinon.stub()
@Settings.path.synctexBaseDir = (project_id) => "#{@Settings.path.compilesDir}/#{@project_id}"
@Settings.path.synctexBaseDir = (project_id) => "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
describe "syncFromCode", ->
beforeEach ->
@fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true})
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@page}\t#{@h}\t#{@v}\t#{@width}\t#{@height}\n", "")
@CompileManager.syncFromCode @project_id, @file_name, @line, @column, @callback
@CompileManager.syncFromCode @project_id, @user_id, @file_name, @line, @column, @callback
it "should execute the synctex binary", ->
bin_path = Path.resolve(__dirname + "/../../../bin/synctex")
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}/output.pdf"
file_path = "#{@Settings.path.compilesDir}/#{@project_id}/#{@file_name}"
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf"
file_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}"
@child_process.execFile
.calledWith(bin_path, ["code", synctex_path, file_path, @line, @column], timeout: 10000)
.should.equal true
@@ -156,12 +271,13 @@ describe "CompileManager", ->
describe "syncFromPdf", ->
beforeEach ->
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@Settings.path.compilesDir}/#{@project_id}/#{@file_name}\t#{@line}\t#{@column}\n", "")
@CompileManager.syncFromPdf @project_id, @page, @h, @v, @callback
@fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true})
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}\t#{@line}\t#{@column}\n", "")
@CompileManager.syncFromPdf @project_id, @user_id, @page, @h, @v, @callback
it "should execute the synctex binary", ->
bin_path = Path.resolve(__dirname + "/../../../bin/synctex")
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}/output.pdf"
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf"
@child_process.execFile
.calledWith(bin_path, ["pdf", synctex_path, @page, @h, @v], timeout: 10000)
.should.equal true
@@ -177,24 +293,25 @@ describe "CompileManager", ->
describe "wordcount", ->
beforeEach ->
@CommandRunner.run = sinon.stub().callsArg(4)
@fs.readFileSync = sinon.stub().returns @stdout = "Encoding: ascii\nWords in text: 2"
@CommandRunner.run = sinon.stub().callsArg(6)
@fs.readFile = sinon.stub().callsArgWith(2, null, @stdout = "Encoding: ascii\nWords in text: 2")
@callback = sinon.stub()
@project_id = "project-id-123"
@timeout = 10 * 1000
@file_name = "main.tex"
@Settings.path.compilesDir = "/local/compile/directory"
@image = "example.com/image"
@CompileManager.wordcount @project_id, @file_name, @callback
@CompileManager.wordcount @project_id, @user_id, @file_name, @image, @callback
it "should run the texcount command", ->
@directory = "#{@Settings.path.compilesDir}/#{@project_id}"
@directory = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@file_path = "$COMPILE_DIR/#{@file_name}"
@command =[ "texcount", "-inc", @file_path, "-out=" + @file_path + ".wc"]
@command =[ "texcount", "-nocol", "-inc", @file_path, "-out=" + @file_path + ".wc"]
@CommandRunner.run
.calledWith(@project_id, @command, @directory, @timeout)
.calledWith("#{@project_id}-#{@user_id}", @command, @directory, @image, @timeout, {})
.should.equal true
it "should call the callback with the parsed output", ->
@@ -208,5 +325,7 @@ describe "CompileManager", ->
elements: 0
mathInline: 0
mathDisplay: 0
errors: 0
messages: ""
})
.should.equal true


@@ -0,0 +1,55 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/ContentTypeMapper'
describe 'ContentTypeMapper', ->
beforeEach ->
@ContentTypeMapper = SandboxedModule.require modulePath
describe 'map', ->
it 'should map .txt to text/plain', ->
content_type = @ContentTypeMapper.map('example.txt')
content_type.should.equal 'text/plain'
it 'should map .csv to text/csv', ->
content_type = @ContentTypeMapper.map('example.csv')
content_type.should.equal 'text/csv'
it 'should map .pdf to application/pdf', ->
content_type = @ContentTypeMapper.map('example.pdf')
content_type.should.equal 'application/pdf'
it 'should fall back to octet-stream', ->
content_type = @ContentTypeMapper.map('example.unknown')
content_type.should.equal 'application/octet-stream'
describe 'coercing web files to plain text', ->
it 'should map .js to plain text', ->
content_type = @ContentTypeMapper.map('example.js')
content_type.should.equal 'text/plain'
it 'should map .html to plain text', ->
content_type = @ContentTypeMapper.map('example.html')
content_type.should.equal 'text/plain'
it 'should map .css to plain text', ->
content_type = @ContentTypeMapper.map('example.css')
content_type.should.equal 'text/plain'
describe 'image files', ->
it 'should map .png to image/png', ->
content_type = @ContentTypeMapper.map('example.png')
content_type.should.equal 'image/png'
it 'should map .jpeg to image/jpeg', ->
content_type = @ContentTypeMapper.map('example.jpeg')
content_type.should.equal 'image/jpeg'
it 'should map .svg to text/plain to protect against XSS (SVG can execute JS)', ->
content_type = @ContentTypeMapper.map('example.svg')
content_type.should.equal 'text/plain'
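The mapping these tests pin down is small enough to show whole: a plain extension table with an `application/octet-stream` fallback, where web-executable types (`js`, `html`, `css`, `svg`) are deliberately coerced to `text/plain` so compile output can never run script in the browser. A sketch matching the asserted behaviour (a re-statement of the tests, not the shipped `ContentTypeMapper`):

```javascript
// Extension -> content type, with browser-executable types served inert.
const TYPES = {
  txt: "text/plain",
  csv: "text/csv",
  pdf: "application/pdf",
  png: "image/png",
  jpeg: "image/jpeg",
  js: "text/plain",
  html: "text/plain",
  css: "text/plain",
  svg: "text/plain", // SVG can embed JavaScript, so never serve it as an image
};

function map(filename) {
  const ext = filename.split(".").pop().toLowerCase();
  return TYPES[ext] || "application/octet-stream";
}
```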


@@ -0,0 +1,61 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/DraftModeManager'
describe 'DraftModeManager', ->
beforeEach ->
@DraftModeManager = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"logger-sharelatex": @logger = {log: () ->}
describe "_injectDraftOption", ->
it "should add draft option into documentclass with existing options", ->
@DraftModeManager
._injectDraftOption('''
\\documentclass[a4paper,foo=bar]{article}
''')
.should.equal('''
\\documentclass[draft,a4paper,foo=bar]{article}
''')
it "should add draft option into documentclass with no options", ->
@DraftModeManager
._injectDraftOption('''
\\documentclass{article}
''')
.should.equal('''
\\documentclass[draft]{article}
''')
describe "injectDraftMode", ->
beforeEach ->
@filename = "/mock/filename.tex"
@callback = sinon.stub()
content = '''
\\documentclass{article}
\\begin{document}
Hello world
\\end{document}
'''
@fs.readFile = sinon.stub().callsArgWith(2, null, content)
@fs.writeFile = sinon.stub().callsArg(2)
@DraftModeManager.injectDraftMode @filename, @callback
it "should read the file", ->
@fs.readFile
.calledWith(@filename, "utf8")
.should.equal true
it "should write the modified file", ->
@fs.writeFile
.calledWith(@filename, """
\\documentclass[draft]{article}
\\begin{document}
Hello world
\\end{document}
""")
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
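The `_injectDraftOption` behaviour asserted above amounts to a small string rewrite on the `\documentclass` line. A minimal JavaScript sketch (not the actual DraftModeManager implementation, which is CoffeeScript) could look like:

```javascript
// Sketch of the behaviour exercised by the tests above:
// prepend "draft" to an existing option list, or add one if absent.
function injectDraftOption(content) {
  return content
    // with existing options: \documentclass[a4paper] -> \documentclass[draft,a4paper]
    .replace(/\\documentclass\[/g, '\\documentclass[draft,')
    // without options: \documentclass{ -> \documentclass[draft]{
    .replace(/\\documentclass\{/g, '\\documentclass[draft]{');
}
```

The second replace cannot double-fire: once options exist, the literal `\documentclass{` no longer appears.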


@@ -19,12 +19,14 @@ describe "LatexRunner", ->
@directory = "/local/compile/directory"
@mainFile = "main-file.tex"
@compiler = "pdflatex"
@image = "example.com/image"
@callback = sinon.stub()
@project_id = "project-id-123"
@env = {'foo': '123'}
describe "runLatex", ->
beforeEach ->
@CommandRunner.run = sinon.stub().callsArg(4)
@CommandRunner.run = sinon.stub().callsArg(6)
describe "normally", ->
beforeEach ->
@@ -33,11 +35,13 @@ describe "LatexRunner", ->
mainFile: @mainFile
compiler: @compiler
timeout: @timeout = 42000
image: @image
environment: @env
@callback
it "should run the latex command", ->
@CommandRunner.run
.calledWith(@project_id, sinon.match.any, @directory, @timeout)
.calledWith(@project_id, sinon.match.any, @directory, @image, @timeout, @env)
.should.equal true
describe "with an .Rtex main file", ->
@@ -46,6 +50,7 @@ describe "LatexRunner", ->
directory: @directory
mainFile: "main-file.Rtex"
compiler: @compiler
image: @image
timeout: @timeout = 42000
@callback


@@ -0,0 +1,54 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/LockManager'
Path = require "path"
Errors = require "../../../app/js/Errors"
describe "LockManager", ->
beforeEach ->
@LockManager = SandboxedModule.require modulePath, requires:
"settings-sharelatex": {}
"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
"lockfile": @Lockfile = {}
@lockFile = "/local/compile/directory/.project-lock"
describe "runWithLock", ->
beforeEach ->
@runner = sinon.stub().callsArgWith(0, null, "foo", "bar")
@callback = sinon.stub()
describe "normally", ->
beforeEach ->
@Lockfile.lock = sinon.stub().callsArgWith(2, null)
@Lockfile.unlock = sinon.stub().callsArgWith(1, null)
@LockManager.runWithLock @lockFile, @runner, @callback
it "should run the compile", ->
@runner
.calledWith()
.should.equal true
it "should call the callback with the response from the compile", ->
@callback
.calledWithExactly(null, "foo", "bar")
.should.equal true
describe "when the project is locked", ->
beforeEach ->
@error = new Error()
@error.code = "EEXIST"
@Lockfile.lock = sinon.stub().callsArgWith(2,@error)
@Lockfile.unlock = sinon.stub().callsArgWith(1, null)
@LockManager.runWithLock @lockFile, @runner, @callback
it "should not run the compile", ->
@runner
.called
.should.equal false
it "should return an error", ->
error = new Errors.AlreadyCompilingError()
@callback
.calledWithExactly(error)
.should.equal true
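The locking contract these tests encode can be sketched against the same callback shapes they stub, `lock(path, opts, cb)` and `unlock(path, cb)`. This is a hypothetical stand-in, not the real LockManager; `AlreadyCompilingError` here substitutes for the app's Errors module.

```javascript
// Stand-in for the error class from the app's Errors module.
class AlreadyCompilingError extends Error {}

// Sketch of runWithLock: acquire the lock, run the compile, always
// release the lock, and map EEXIST to AlreadyCompilingError.
function runWithLock(lockfile, lockPath, runner, callback) {
  lockfile.lock(lockPath, {}, (err) => {
    // EEXIST means another compile already holds the lock
    if (err && err.code === 'EEXIST') return callback(new AlreadyCompilingError());
    if (err) return callback(err);
    runner((error, ...results) => {
      // release the lock, then pass the compile result through unchanged
      lockfile.unlock(lockPath, () => callback(error, ...results));
    });
  });
}
```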


@@ -0,0 +1,103 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/OutputFileOptimiser'
path = require "path"
expect = require("chai").expect
EventEmitter = require("events").EventEmitter
describe "OutputFileOptimiser", ->
beforeEach ->
@OutputFileOptimiser = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"path": @Path = {}
"child_process": spawn: @spawn = sinon.stub()
"logger-sharelatex": { log: sinon.stub(), warn: sinon.stub() }
"./Metrics" : {}
@directory = "/test/dir"
@callback = sinon.stub()
describe "optimiseFile", ->
beforeEach ->
@src = "./output.pdf"
@dst = "./output.pdf"
describe "when the file is not a pdf file", ->
beforeEach (done)->
@src = "./output.log"
@OutputFileOptimiser.checkIfPDFIsOptimised = sinon.stub().callsArgWith(1, null, false)
@OutputFileOptimiser.optimisePDF = sinon.stub().callsArgWith(2, null)
@OutputFileOptimiser.optimiseFile @src, @dst, done
it "should not check if the file is optimised", ->
@OutputFileOptimiser.checkIfPDFIsOptimised.calledWith(@src).should.equal false
it "should not optimise the file", ->
@OutputFileOptimiser.optimisePDF.calledWith(@src, @dst).should.equal false
describe "when the pdf file is not optimised", ->
beforeEach (done) ->
@OutputFileOptimiser.checkIfPDFIsOptimised = sinon.stub().callsArgWith(1, null, false)
@OutputFileOptimiser.optimisePDF = sinon.stub().callsArgWith(2, null)
@OutputFileOptimiser.optimiseFile @src, @dst, done
it "should check if the pdf is optimised", ->
@OutputFileOptimiser.checkIfPDFIsOptimised.calledWith(@src).should.equal true
it "should optimise the pdf", ->
@OutputFileOptimiser.optimisePDF.calledWith(@src, @dst).should.equal true
describe "when the pdf file is optimised", ->
beforeEach (done) ->
@OutputFileOptimiser.checkIfPDFIsOptimised = sinon.stub().callsArgWith(1, null, true)
@OutputFileOptimiser.optimisePDF = sinon.stub().callsArgWith(2, null)
@OutputFileOptimiser.optimiseFile @src, @dst, done
it "should check if the pdf is optimised", ->
@OutputFileOptimiser.checkIfPDFIsOptimised.calledWith(@src).should.equal true
it "should not optimise the pdf", ->
@OutputFileOptimiser.optimisePDF.calledWith(@src, @dst).should.equal false
describe "checkIfPDFIsOptimised", ->
beforeEach () ->
@callback = sinon.stub()
@fd = 1234
@fs.open = sinon.stub().yields(null, @fd)
@fs.read = sinon.stub().withArgs(@fd).yields(null, 100, new Buffer("hello /Linearized 1"))
@fs.close = sinon.stub().withArgs(@fd).yields(null)
@OutputFileOptimiser.checkIfPDFIsOptimised @src, @callback
describe "for a linearised file", ->
beforeEach () ->
@fs.read = sinon.stub().withArgs(@fd).yields(null, 100, new Buffer("hello /Linearized 1"))
@OutputFileOptimiser.checkIfPDFIsOptimised @src, @callback
it "should open the file", ->
@fs.open.calledWith(@src, "r").should.equal true
it "should read the header", ->
@fs.read.calledWith(@fd).should.equal true
it "should close the file", ->
@fs.close.calledWith(@fd).should.equal true
it "should call the callback with a true result", ->
@callback.calledWith(null, true).should.equal true
describe "for an unlinearised file", ->
beforeEach () ->
@fs.read = sinon.stub().withArgs(@fd).yields(null, 100, new Buffer("hello not linearized 1"))
@OutputFileOptimiser.checkIfPDFIsOptimised @src, @callback
it "should open the file", ->
@fs.open.calledWith(@src, "r").should.equal true
it "should read the header", ->
@fs.read.calledWith(@fd).should.equal true
it "should close the file", ->
@fs.close.calledWith(@fd).should.equal true
it "should call the callback with a false result", ->
@callback.calledWith(null, false).should.equal true


@@ -13,6 +13,7 @@ describe "ProjectPersistenceManager", ->
"./db": @db = {}
@callback = sinon.stub()
@project_id = "project-id-123"
@user_id = "1234"
describe "clearExpiredProjects", ->
beforeEach ->
@@ -21,12 +22,13 @@ describe "ProjectPersistenceManager", ->
"project-id-2"
]
@ProjectPersistenceManager._findExpiredProjectIds = sinon.stub().callsArgWith(0, null, @project_ids)
@ProjectPersistenceManager.clearProject = sinon.stub().callsArg(1)
@ProjectPersistenceManager.clearProjectFromCache = sinon.stub().callsArg(1)
@CompileManager.clearExpiredProjects = sinon.stub().callsArg(1)
@ProjectPersistenceManager.clearExpiredProjects @callback
it "should clear each expired project", ->
for project_id in @project_ids
@ProjectPersistenceManager.clearProject
@ProjectPersistenceManager.clearProjectFromCache
.calledWith(project_id)
.should.equal true
@@ -37,8 +39,8 @@ describe "ProjectPersistenceManager", ->
beforeEach ->
@ProjectPersistenceManager._clearProjectFromDatabase = sinon.stub().callsArg(1)
@UrlCache.clearProject = sinon.stub().callsArg(1)
@CompileManager.clearProject = sinon.stub().callsArg(1)
@ProjectPersistenceManager.clearProject @project_id, @callback
@CompileManager.clearProject = sinon.stub().callsArg(2)
@ProjectPersistenceManager.clearProject @project_id, @user_id, @callback
it "should clear the project from the database", ->
@ProjectPersistenceManager._clearProjectFromDatabase
@@ -52,7 +54,7 @@ describe "ProjectPersistenceManager", ->
it "should clear the project compile folder", ->
@CompileManager.clearProject
.calledWith(@project_id)
.calledWith(@project_id, @user_id)
.should.equal true
it "should call the callback", ->


@@ -206,11 +206,49 @@ describe "RequestParser", ->
describe "with a root resource path that needs escaping", ->
beforeEach ->
@validRequest.compile.rootResourcePath = "`rm -rf foo`.tex"
@badPath = "`rm -rf foo`.tex"
@goodPath = "rm -rf foo.tex"
@validRequest.compile.rootResourcePath = @badPath
@validRequest.compile.resources.push {
path: @badPath
date: "12:00 01/02/03"
content: "Hello world"
}
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return the escaped resource", ->
@data.rootResourcePath.should.equal "rm -rf foo.tex"
@data.rootResourcePath.should.equal @goodPath
it "should also escape the resource path", ->
@data.resources[0].path.should.equal @goodPath
describe "with a root resource path that has a relative path", ->
beforeEach ->
@validRequest.compile.rootResourcePath = "foo/../../bar.tex"
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return an error", ->
@callback.calledWith("relative path in root resource")
.should.equal true
describe "with a root resource path that has unescaped + relative path", ->
beforeEach ->
@validRequest.compile.rootResourcePath = "foo/#../bar.tex"
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return an error", ->
@callback.calledWith("relative path in root resource")
.should.equal true
describe "with an unknown syncType", ->
beforeEach ->
@validRequest.compile.options.syncType = "unexpected"
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return an error", ->
@callback.calledWith("syncType attribute should be one of: full, incremental")
.should.equal true
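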
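The escaping behaviour tested above (`` `rm -rf foo`.tex `` becoming `rm -rf foo.tex`) is consistent with stripping shell-dangerous characters via a whitelist. A hypothetical sketch; the real RequestParser may use a different character set:

```javascript
// Sketch: keep only word characters, whitespace, dots, slashes and
// hyphens; everything else (backticks, #, $, ...) is dropped.
function sanitizePath(resourcePath) {
  return resourcePath.replace(/[^\w\s./-]/g, '');
}
```

Note the relative-path check in the tests runs after this escaping, which is why `foo/#../bar.tex` is still rejected: once `#` is stripped it becomes a relative path.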


@@ -0,0 +1,109 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
should = require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/ResourceStateManager'
Path = require "path"
Errors = require "../../../app/js/Errors"
describe "ResourceStateManager", ->
beforeEach ->
@ResourceStateManager = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"logger-sharelatex": {log: sinon.stub(), err: sinon.stub()}
"./SafeReader": @SafeReader = {}
@basePath = "/path/to/write/files/to"
@resources = [
{path: "resource-1-mock"}
{path: "resource-2-mock"}
{path: "resource-3-mock"}
]
@state = "1234567890"
@resourceFileName = "#{@basePath}/.project-sync-state"
@resourceFileContents = "#{@resources[0].path}\n#{@resources[1].path}\n#{@resources[2].path}\nstateHash:#{@state}"
@callback = sinon.stub()
describe "saveProjectState", ->
beforeEach ->
@fs.writeFile = sinon.stub().callsArg(2)
describe "when the state is specified", ->
beforeEach ->
@ResourceStateManager.saveProjectState(@state, @resources, @basePath, @callback)
it "should write the resource list to disk", ->
@fs.writeFile
.calledWith(@resourceFileName, @resourceFileContents)
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
describe "when the state is undefined", ->
beforeEach ->
@state = undefined
@fs.unlink = sinon.stub().callsArg(1)
@ResourceStateManager.saveProjectState(@state, @resources, @basePath, @callback)
it "should unlink the resource file", ->
@fs.unlink
.calledWith(@resourceFileName)
.should.equal true
it "should not write the resource list to disk", ->
@fs.writeFile.called.should.equal false
it "should call the callback", ->
@callback.called.should.equal true
describe "checkProjectStateMatches", ->
describe "when the state matches", ->
beforeEach ->
@SafeReader.readFile = sinon.stub().callsArgWith(3, null, @resourceFileContents)
@ResourceStateManager.checkProjectStateMatches(@state, @basePath, @callback)
it "should read the resource file", ->
@SafeReader.readFile
.calledWith(@resourceFileName)
.should.equal true
it "should call the callback with the results", ->
@callback.calledWithMatch(null, @resources).should.equal true
describe "when the state does not match", ->
beforeEach ->
@SafeReader.readFile = sinon.stub().callsArgWith(3, null, @resourceFileContents)
@ResourceStateManager.checkProjectStateMatches("not-the-original-state", @basePath, @callback)
it "should call the callback with an error", ->
error = new Errors.FilesOutOfSyncError("invalid state for incremental update")
@callback.calledWith(error).should.equal true
describe "checkResourceFiles", ->
describe "when all the files are present", ->
beforeEach ->
@allFiles = [ @resources[0].path, @resources[1].path, @resources[2].path]
@ResourceStateManager.checkResourceFiles(@resources, @allFiles, @basePath, @callback)
it "should call the callback", ->
@callback.calledWithExactly().should.equal true
describe "when there is a missing file", ->
beforeEach ->
@allFiles = [ @resources[0].path, @resources[1].path]
@fs.stat = sinon.stub().callsArgWith(1, new Error())
@ResourceStateManager.checkResourceFiles(@resources, @allFiles, @basePath, @callback)
it "should call the callback with an error", ->
error = new Errors.FilesOutOfSyncError("resource files missing in incremental update")
@callback.calledWith(error).should.equal true
describe "when a resource contains a relative path", ->
beforeEach ->
@resources[0].path = "../foo/bar.tex"
@allFiles = [ @resources[0].path, @resources[1].path, @resources[2].path]
@ResourceStateManager.checkResourceFiles(@resources, @allFiles, @basePath, @callback)
it "should call the callback with an error", ->
@callback.calledWith(new Error("relative path in resource file list")).should.equal true
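The `.project-sync-state` format implied by these tests is one resource path per line followed by a final `stateHash:<state>` line. A round-trip sketch (not the actual ResourceStateManager):

```javascript
// Serialise: resource paths, one per line, then the state hash line.
function serializeProjectState(state, resources) {
  return resources.map((r) => r.path).join('\n') + '\nstateHash:' + state;
}

// Parse: the last line carries the state hash, the rest are paths.
function parseProjectState(contents) {
  const lines = contents.split('\n');
  const stateLine = lines.pop();
  return {
    state: stateLine.replace(/^stateHash:/, ''),
    resources: lines.map((path) => ({ path })),
  };
}
```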


@@ -7,11 +7,15 @@ path = require "path"
describe "ResourceWriter", ->
beforeEach ->
@ResourceWriter = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"fs": @fs =
mkdir: sinon.stub().callsArg(1)
unlink: sinon.stub().callsArg(1)
"./ResourceStateManager": @ResourceStateManager = {}
"wrench": @wrench = {}
"./UrlCache" : @UrlCache = {}
"mkdirp" : @mkdirp = sinon.stub().callsArg(1)
"./OutputFileFinder": @OutputFileFinder = {}
"logger-sharelatex": {log: sinon.stub(), err: sinon.stub()}
"./Metrics": @Metrics =
Timer: class Timer
done: sinon.stub()
@@ -19,7 +23,7 @@ describe "ResourceWriter", ->
@basePath = "/path/to/write/files/to"
@callback = sinon.stub()
describe "syncResourcesToDisk", ->
describe "syncResourcesToDisk on a full request", ->
beforeEach ->
@resources = [
"resource-1-mock"
@@ -28,7 +32,12 @@ describe "ResourceWriter", ->
]
@ResourceWriter._writeResourceToDisk = sinon.stub().callsArg(3)
@ResourceWriter._removeExtraneousFiles = sinon.stub().callsArg(2)
@ResourceWriter.syncResourcesToDisk(@project_id, @resources, @basePath, @callback)
@ResourceStateManager.saveProjectState = sinon.stub().callsArg(3)
@ResourceWriter.syncResourcesToDisk({
project_id: @project_id
syncState: @syncState = "0123456789abcdef"
resources: @resources
}, @basePath, @callback)
it "should remove old files", ->
@ResourceWriter._removeExtraneousFiles
@@ -41,9 +50,77 @@ describe "ResourceWriter", ->
.calledWith(@project_id, resource, @basePath)
.should.equal true
it "should store the sync state and resource list", ->
@ResourceStateManager.saveProjectState
.calledWith(@syncState, @resources, @basePath)
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
describe "syncResourcesToDisk on an incremental update", ->
beforeEach ->
@resources = [
"resource-1-mock"
]
@ResourceWriter._writeResourceToDisk = sinon.stub().callsArg(3)
@ResourceWriter._removeExtraneousFiles = sinon.stub().callsArgWith(2, null, @outputFiles = [], @allFiles = [])
@ResourceStateManager.checkProjectStateMatches = sinon.stub().callsArgWith(2, null, @resources)
@ResourceStateManager.saveProjectState = sinon.stub().callsArg(3)
@ResourceStateManager.checkResourceFiles = sinon.stub().callsArg(3)
@ResourceWriter.syncResourcesToDisk({
project_id: @project_id,
syncType: "incremental",
syncState: @syncState = "1234567890abcdef",
resources: @resources
}, @basePath, @callback)
it "should check the sync state matches", ->
@ResourceStateManager.checkProjectStateMatches
.calledWith(@syncState, @basePath)
.should.equal true
it "should remove old files", ->
@ResourceWriter._removeExtraneousFiles
.calledWith(@resources, @basePath)
.should.equal true
it "should check each resource exists", ->
@ResourceStateManager.checkResourceFiles
.calledWith(@resources, @allFiles, @basePath)
.should.equal true
it "should write each resource to disk", ->
for resource in @resources
@ResourceWriter._writeResourceToDisk
.calledWith(@project_id, resource, @basePath)
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
describe "syncResourcesToDisk on an incremental update when the state does not match", ->
beforeEach ->
@resources = [
"resource-1-mock"
]
@ResourceStateManager.checkProjectStateMatches = sinon.stub().callsArgWith(2, @error = new Error())
@ResourceWriter.syncResourcesToDisk({
project_id: @project_id,
syncType: "incremental",
syncState: @syncState = "1234567890abcdef",
resources: @resources
}, @basePath, @callback)
it "should check whether the sync state matches", ->
@ResourceStateManager.checkProjectStateMatches
.calledWith(@syncState, @basePath)
.should.equal true
it "should call the callback with an error", ->
@callback.calledWith(@error).should.equal true
describe "_removeExtraneousFiles", ->
beforeEach ->
@output_files = [{
@@ -55,6 +132,8 @@ describe "ResourceWriter", ->
}, {
path: "extra.aux"
type: "aux"
}, {
path: "cache/_chunk1"
}]
@resources = "mock-resources"
@OutputFileFinder.findOutputFiles = sinon.stub().callsArgWith(2, null, @output_files)
@@ -80,6 +159,11 @@ describe "ResourceWriter", ->
@ResourceWriter._deleteFileIfNotDirectory
.calledWith(path.join(@basePath, "extra.aux"))
.should.equal false
it "should not delete the knitr cache file", ->
@ResourceWriter._deleteFileIfNotDirectory
.calledWith(path.join(@basePath, "cache/_chunk1"))
.should.equal false
it "should call the callback", ->
@callback.called.should.equal true
@@ -150,6 +234,27 @@ describe "ResourceWriter", ->
.calledWith(new Error("resource path is outside root directory"))
.should.equal true
describe "checkPath", ->
describe "with a valid path", ->
beforeEach ->
@ResourceWriter.checkPath("foo", "bar", @callback)
it "should return the joined path", ->
@callback.calledWith(null, "foo/bar")
.should.equal true
describe "with an invalid path", ->
beforeEach ->
@ResourceWriter.checkPath("foo", "baz/../../bar", @callback)
it "should return an error", ->
@callback.calledWith(new Error("resource path is outside root directory"))
.should.equal true
describe "with another invalid path matching on a prefix", ->
beforeEach ->
@ResourceWriter.checkPath("foo", "../foobar/baz", @callback)
it "should return an error", ->
@callback.calledWith(new Error("resource path is outside root directory"))
.should.equal true


@@ -0,0 +1,101 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/TikzManager'
describe 'TikzManager', ->
beforeEach ->
@TikzManager = SandboxedModule.require modulePath, requires:
"./ResourceWriter": @ResourceWriter = {}
"./SafeReader": @SafeReader = {}
"fs": @fs = {}
"logger-sharelatex": @logger = {log: () ->}
describe "checkMainFile", ->
beforeEach ->
@compileDir = "compile-dir"
@mainFile = "main.tex"
@callback = sinon.stub()
describe "if there is already an output.tex file in the resources", ->
beforeEach ->
@resources = [{path:"main.tex"},{path:"output.tex"}]
@TikzManager.checkMainFile @compileDir, @mainFile, @resources, @callback
it "should call the callback with false", ->
@callback.calledWithExactly(null, false)
.should.equal true
describe "if there is no output.tex file in the resources", ->
beforeEach ->
@resources = [{path:"main.tex"}]
@ResourceWriter.checkPath = sinon.stub()
.withArgs(@compileDir, @mainFile)
.callsArgWith(2, null, "#{@compileDir}/#{@mainFile}")
describe "and the main file contains tikzexternalize", ->
beforeEach ->
@SafeReader.readFile = sinon.stub()
.withArgs("#{@compileDir}/#{@mainFile}")
.callsArgWith(3, null, "hello \\tikzexternalize")
@TikzManager.checkMainFile @compileDir, @mainFile, @resources, @callback
it "should look at the file on disk", ->
@SafeReader.readFile
.calledWith("#{@compileDir}/#{@mainFile}")
.should.equal true
it "should call the callback with true", ->
@callback.calledWithExactly(null, true)
.should.equal true
describe "and the main file does not contain tikzexternalize", ->
beforeEach ->
@SafeReader.readFile = sinon.stub()
.withArgs("#{@compileDir}/#{@mainFile}")
.callsArgWith(3, null, "hello")
@TikzManager.checkMainFile @compileDir, @mainFile, @resources, @callback
it "should look at the file on disk", ->
@SafeReader.readFile
.calledWith("#{@compileDir}/#{@mainFile}")
.should.equal true
it "should call the callback with false", ->
@callback.calledWithExactly(null, false)
.should.equal true
describe "injectOutputFile", ->
beforeEach ->
@rootDir = "/mock"
@filename = "filename.tex"
@callback = sinon.stub()
@content = '''
\\documentclass{article}
\\usepackage{tikz}
\\tikzexternalize
\\begin{document}
Hello world
\\end{document}
'''
@fs.readFile = sinon.stub().callsArgWith(2, null, @content)
@fs.writeFile = sinon.stub().callsArg(3)
@ResourceWriter.checkPath = sinon.stub().callsArgWith(2, null, "#{@rootDir}/#{@filename}")
@TikzManager.injectOutputFile @rootDir, @filename, @callback
it "should check the path", ->
@ResourceWriter.checkPath.calledWith(@rootDir, @filename)
.should.equal true
it "should read the file", ->
@fs.readFile
.calledWith("#{@rootDir}/#{@filename}", "utf8")
.should.equal true
it "should write out the same file as output.tex", ->
@fs.writeFile
.calledWith("#{@rootDir}/output.tex", @content, {flag: 'wx'})
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
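The decision logic the TikzManager tests encode can be summarised in one predicate: inject an `output.tex` copy only when the main file uses `\tikzexternalize` and no `output.tex` is already among the resources. A sketch, assuming this reading of the tests:

```javascript
// Sketch of checkMainFile's decision: true only if output.tex is not
// already a resource AND the main file invokes \tikzexternalize.
function needsOutputFile(mainFileContents, resources) {
  const hasOutputTex = resources.some((r) => r.path === 'output.tex');
  return !hasOutputTex && mainFileContents.includes('\\tikzexternalize');
}
```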