491 Commits

Author SHA1 Message Date
Christopher Hoskin
a47237ccb7 Explicitly start build immediately 2019-10-25 16:54:36 +01:00
Christopher Hoskin
ab82140128 Try some parallelism 2019-10-25 16:42:33 +01:00
Christopher Hoskin
4d81b2ca53 Another go at printing out times 2019-10-25 11:55:57 +01:00
Christopher Hoskin
c1eca448c5 Need bash for pipefail 2019-10-25 11:21:24 +01:00
Christopher Hoskin
d04df4ed75 Try to get more times 2019-10-25 10:42:21 +01:00
Christopher Hoskin
d9f487efc4 Remove bin/install_texlive_gce.sh 2019-10-25 10:10:14 +01:00
Christopher Hoskin
02b5cc8efd Revert "Remove TEXLIVE_IMAGE env var - expect failure"
This reverts commit 3ab407b91a.
2019-10-24 17:08:43 +01:00
Christopher Hoskin
f3c6756294 Try to narrow down where the delay is 2019-10-24 14:52:28 +01:00
Christopher Hoskin
3ab407b91a Remove TEXLIVE_IMAGE env var - expect failure 2019-10-24 12:13:29 +01:00
Christopher Hoskin
0b40c8f79d Try a different texlive image 2019-10-24 10:58:04 +01:00
Christopher Hoskin
882732d6a5 Try build on bigger machine 2019-10-23 21:42:09 +01:00
Christopher Hoskin
d934f96370 Increase timeout 2019-10-23 19:03:15 +01:00
Christopher Hoskin
0f84c47bbe Try again with gcr texlive image 2019-10-23 17:08:13 +01:00
Christopher Hoskin
fdd87d77cc Try env pass-through 2019-10-23 16:35:59 +01:00
Christopher Hoskin
7ebc9b43a1 Bump buildscript to 1.1.22 2019-10-23 16:28:23 +01:00
Christopher Hoskin
4c4dd64ca6 Try again 2019-10-23 14:31:15 +01:00
Christopher Hoskin
693b9e6193 Use gcr.io texlive image 2019-10-23 14:26:37 +01:00
Christopher Hoskin
65d416ee10 Add quay.io/sharelatex/texlive-full:2017.1 as a custom builder 2019-10-23 12:03:11 +01:00
Christopher Hoskin
8f70dbd67b Start trying to figure out CLSI cloudbuild 2019-10-23 12:00:15 +01:00
Miguel Serrano
a62ff6e248 Merge pull request #131 from yuantailing/fix-compiler-manager
Fix synctex for LocalCommandRunner
2019-10-08 12:48:13 +02:00
Tailing Yuan
481a49a587 fix CompileManager and LocalCommandRunner 2019-10-04 23:02:03 +08:00
Shane Kilkelly
2675fa033e Merge pull request #128 from overleaf/sk-dep-upgrades-2
Update logger, metrics
2019-07-11 12:51:16 +01:00
Shane Kilkelly
dc6af8799f update logger and metrics 2019-06-18 16:29:20 +01:00
Shane Kilkelly
61bed0da2b Merge pull request #126 from overleaf/sk-increase-hard-timeout
Increase the hard-timeout to 10 minutes.
2019-06-10 09:44:48 +01:00
Shane Kilkelly
4f6ef61626 Increase the hard-timeout to 10 minutes.
In practice most projects will still be limited to five minutes,
but this allows us to bump up the limit for some projects,
especially legacy v1 projects that have been imported to v2
2019-06-06 16:39:16 +01:00
Brian Gough
ada07ad2c3 Merge pull request #120 from das7pad/hotfix/docker-group
[docker] add support for a different docker group id on the docker host
2019-05-16 14:04:27 +01:00
Brian Gough
bc530c70e2 Merge pull request #119 from overleaf/bg-increase-acceptance-test-timeout
increase timeout for long-running acceptance tests
2019-05-16 09:17:26 +01:00
Michael Mazour
db00288bb9 Merge pull request #125 from overleaf/mm-flags-in-request
Add flags option to request JSON
2019-05-15 14:06:47 +01:00
Michael Mazour
663ec88718 Add flags option to request JSON
Adds a `flags` parameter to the request JSON, appearing under the `compile.options` key (alongside such stalwarts as `compiler`, `timeout`, etc.).

This is primarily to support `-file-line-error` as an option, but could have other uses as well.

`flags` should be an array of strings, or absent. If supplied, the listed arguments are added to the base latexmk command.
2019-05-14 16:24:34 +01:00
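For illustration, a compile request using the new option might look like this (a sketch: the surrounding fields follow the usual CLSI request shape, and the values are made up):

    {
      "compile": {
        "options": {
          "compiler": "pdflatex",
          "timeout": 60,
          "flags": ["-file-line-error"]
        },
        "rootResourcePath": "main.tex",
        "resources": [
          { "path": "main.tex", "content": "\\documentclass{article}..." }
        ]
      }
    }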
Tim Alby
03047f45af update Git URL in Jenkinsfile 2019-05-07 18:31:54 +02:00
Timothée Alby
11cf8a98fa Update README.md 2019-05-07 16:41:17 +01:00
Christopher Hoskin
d2c2629ef5 Bump buildscripts from 1.1.11 to 1.1.20 2019-05-03 10:29:38 +01:00
Jakob Ackermann
adfeffd254 [docker] add support for a different docker group id on the docker host
Signed-off-by: Jakob Ackermann <das7pad@outlook.com>
2019-04-23 01:53:40 +02:00
Brian Gough
bd42fe5776 increase timeout for long-running acceptance tests 2019-04-01 09:42:54 +01:00
Christopher Hoskin
3200161308 Merge pull request #116 from sharelatex/csh-formalise-node-10.15
Formalise node 10.15 update
2019-03-28 11:59:08 +00:00
Christopher Hoskin
9cb14660d4 Formalise node 10.15 update 2019-03-26 11:50:59 +00:00
Henry Oswald
31153c479c change console.log for logger.log 2019-03-22 20:42:26 +00:00
Christopher Hoskin
f422bb8011 Merge pull request #113 from sharelatex/ho-osx-epoll
add epoll_pwait to seccomp profile
2019-03-04 14:57:01 +00:00
Christopher Hoskin
25c4c349d7 Merge pull request #115 from sharelatex/csh-issue-204-clsi-log-stackdriver
Bump logger to v1.6.0
2019-03-04 14:56:17 +00:00
Christopher Hoskin
e2377e1c1c Bump logger to v1.6.0 2019-03-04 12:05:28 +00:00
Brian Gough
1899d27732 increase acceptance test timeout to 1 minute 2019-02-22 13:58:12 +00:00
Brian Gough
9bf3795ceb Merge pull request #114 from sharelatex/bg-avoid-text-html-content-type-in-responses
use explicit json content-type to avoid security issues with text/html
2019-02-22 11:35:24 +00:00
Brian Gough
d20856f799 use explicit json content-type to avoid security issues with text/html 2019-02-12 16:54:59 +00:00
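A sketch of the idea (not the actual CLSI handler): when sending a JSON string body through Express, set the content-type explicitly, since Express defaults plain string responses to text/html.

    # minimal sketch in an Express app; the route is invented for the example
    app.get "/status", (req, res) ->
      res.setHeader "Content-Type", "application/json"
      res.send JSON.stringify(status: "up")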
Henry Oswald
12fee9e4df add epoll_pwait to seccomp profile
Last year golang changed from epoll_wait to epoll_pwait https://github.com/golang/go/issues/23750

This causes golang panic errors on Mac when running seccomp secure compiles using docker 18.09.1. It may start to become a problem on Linux where we are running on 17.03.2-ce in production.
2019-01-24 12:30:37 +00:00
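For context, a Docker seccomp profile whitelists syscalls with entries like the following (a minimal fragment only; a real profile also sets archMap and many more syscalls):

    {
      "defaultAction": "SCMP_ACT_ERRNO",
      "syscalls": [
        { "names": ["epoll_wait", "epoll_pwait"], "action": "SCMP_ACT_ALLOW", "args": [] }
      ]
    }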
Christopher Hoskin
ddaa944aa3 Merge pull request #112 from sharelatex/csh-issue-1309-node-10.15.0
Upgrade to Node 10 - CLSI
2019-01-17 09:50:19 +00:00
Christopher Hoskin
a194d7ad05 Fix broken spacing 2019-01-16 15:12:23 +00:00
Christopher Hoskin
4c8b619ee8 Switch to node 10 2019-01-16 15:11:49 +00:00
Christopher Hoskin
4bd67d5e7e Merge pull request #111 from sharelatex/csh-issue-1338-bulk-upgrade
Services bulk upgrade - CLSI
2019-01-15 12:28:35 +00:00
Christopher Hoskin
c269c308ef Correctly pass command with arguments to runuser 2019-01-15 11:29:04 +00:00
Christopher Hoskin
e12ffdd535 Pass arguments to node, not to runuser 2019-01-15 11:12:21 +00:00
Christopher Hoskin
82afad7afc Add **/*.map to .gitignore 2019-01-11 12:11:36 +00:00
Christopher Hoskin
2fceac6ac8 Remove grunt 2019-01-11 12:06:45 +00:00
Christopher Hoskin
d4e9aca9e2 Bump buildscript to 1.1.11 2019-01-11 11:52:10 +00:00
Christopher Hoskin
5d2eb129e8 Init metrics at top of app.coffee 2019-01-11 10:19:47 +00:00
Christopher Hoskin
b52a8b2aa2 Bump logger to v1.5.9 and settings to v1.1.0 2019-01-11 10:18:37 +00:00
Henry Oswald
6fbdcd76d0 Merge pull request #110 from sharelatex/ho-increase-compile-size
pull clsi compile size limit into setting and bump to 7mb
2019-01-08 13:30:00 +00:00
Henry Oswald
541dac11cb pull clsi compile size limit into setting and bump to 7mb 2019-01-08 12:56:16 +00:00
Christopher Hoskin
ee7947f54d Merge pull request #107 from sharelatex/csh-issue-1309-node-6.15
Bump node to 6.15
2018-12-18 11:16:25 +00:00
Christopher Hoskin
984474ee11 Add npm-shrinkwrap.json 2018-12-18 11:03:06 +00:00
Christopher Hoskin
be855805c9 package-lock not supported until npm 5 2018-12-17 15:31:45 +00:00
Christopher Hoskin
2d023a3b03 Bump node to 6.15.1 2018-12-17 15:29:56 +00:00
Christopher Hoskin
1894e8ad5d Merge pull request #106 from sharelatex/csh-prom-metrics
Use Prometheus metrics
2018-12-14 10:21:40 +00:00
Christopher Hoskin
9507f0f80f Revert "Bump buildscript to 1.1.10"
This reverts commit 38874f9169.
2018-12-13 17:37:16 +00:00
Christopher Hoskin
19078fe866 Revert "Initialise metrics at beginning of app"
This reverts commit 855f26c520.
2018-12-13 17:33:45 +00:00
Christopher Hoskin
38874f9169 Bump buildscript to 1.1.10 2018-12-13 14:45:40 +00:00
Christopher Hoskin
855f26c520 Initialise metrics at beginning of app 2018-12-13 14:24:44 +00:00
Christopher Hoskin
8401bbdc26 Bump metrics-sharelatex to v2.0.12 2018-12-13 14:21:32 +00:00
Christopher Hoskin
71181243b3 Bump metrics-sharelatex.git to v2.0.11 2018-12-13 14:15:19 +00:00
Christopher Hoskin
0b4ae6ef8d Use metrics which labels host in timing 2018-12-11 12:11:53 +00:00
Christopher Hoskin
747c73fdad Merge pull request #105 from sharelatex/csh-204
Bump metrics to 2.0.4
2018-12-03 15:12:16 +00:00
Christopher Hoskin
1c1610a0bc Bump metrics to 2.0.4 2018-12-03 15:10:39 +00:00
Christopher Hoskin
434e819d23 Merge pull request #104 from sharelatex/csh-stackdriver
Add Prometheus Metrics to CLSIs
2018-12-03 11:45:02 +00:00
Christopher Hoskin
f92e626647 Inject routes after app defined 2018-11-29 15:49:12 +00:00
Christopher Hoskin
6159aff001 Inject metrics 2018-11-29 14:30:00 +00:00
Christopher Hoskin
49d5ad711a Bump metrics to v2.0.3 - specify tag correctly this time 2018-11-29 10:24:25 +00:00
Christopher Hoskin
bcdac34a0b Use v1.9.0 of metrics to get Prometheus support 2018-11-29 10:10:48 +00:00
Christopher Hoskin
25cb54d1d7 Merge branch 'master' into csh-stackdriver 2018-11-29 10:06:48 +00:00
Henry Oswald
75e77a3991 Merge pull request #103 from sharelatex/ho-mute-sentry-errors
have failed compiles warn rather than be an error
2018-11-28 22:35:51 +09:00
Henry Oswald
49f3b7d54f have failed compiles warn rather than be an error 2018-11-23 15:10:35 +00:00
Christopher Hoskin
f1ab938bab Merge pull request #102 from sharelatex/csh-expand-abbr
Expand CLSI to Common LaTeX Service Interface on first use
2018-11-22 09:52:30 +00:00
Christopher Hoskin
a18d49562c Expand CLSI to Common LaTeX Service Interface on first use 2018-11-22 09:13:23 +00:00
Christopher Hoskin
d3039a52f3 First attempt to use my stackdriver branch 2018-11-07 08:29:34 +00:00
Christopher Hoskin
7e07b8b4a7 Merge pull request #101 from sharelatex/csh-documentation
Add some notes on the CLSIs
2018-10-23 14:43:06 +01:00
Christopher Hoskin
473efdae70 Merge branch 'csh-documentation' of github.com:sharelatex/clsi-sharelatex into csh-documentation 2018-10-22 17:55:47 +01:00
Christopher Hoskin
3aa160b0e7 Make README more generic 2018-10-22 17:52:38 +01:00
Christopher Hoskin
114e4f7043 Fix indenting 2018-10-22 16:03:50 +01:00
Christopher Hoskin
cd0a71caba Add some notes on the CLSIs 2018-10-22 16:01:17 +01:00
Brian Gough
96d6fb3404 Merge pull request #100 from sharelatex/bg-create-main-file-for-pstool
use TikzManager to create main file for pstool package
2018-10-15 11:05:23 +01:00
Brian Gough
1481b4fe50 fix exception when content undefined in TikzManager 2018-10-15 10:01:52 +01:00
Brian Gough
3aad472a83 improve log message 2018-10-12 10:49:54 +01:00
Brian Gough
49ddcee0c6 use TikzManager to create main file for pstool package 2018-10-10 16:13:20 +01:00
Brian Gough
6d1545a40e Merge pull request #99 from sharelatex/bg-cache-tikz-minted-and-markdown-outputs
extend caching for tikz, minted and markdown files
2018-10-08 09:22:20 +01:00
Brian Gough
9ce7bfa8ab extend caching for tikz, minted and markdown files 2018-10-04 16:56:48 +01:00
Henry Oswald
7c4c8a9e44 remove debugging get settings function 2018-09-14 10:26:40 +01:00
Brian Gough
90436933da Merge pull request #96 from sharelatex/bg-cache-eps-to-pdf-converted-files
cache pdf files generated by epstopdf
2018-09-11 13:31:26 +01:00
Henry Oswald
77abf19f6b Merge pull request #86 from sharelatex/ho-dockerise
Dockerised clsi
2018-09-11 12:36:11 +01:00
Henry Oswald
a781c7f600 change timeout test latex code 2018-09-11 11:34:25 +01:00
Henry Oswald
b07b7a84be fix unit tests 2018-09-11 10:21:37 +01:00
Henry Oswald
58b4de905c Merge branch 'master' into ho-dockerise 2018-09-11 10:02:24 +01:00
Henry Oswald
5f9fb85613 bump wordcount timeouts, taken from 82b996b145 2018-09-11 09:55:10 +01:00
Henry Oswald
d3bb863d0a improve synctex logging 2018-09-11 09:51:20 +01:00
Brian Gough
00ebc87230 cache pdf files generated by epstopdf 2018-09-11 09:44:22 +01:00
Henry Oswald
6299832a13 don't error on a bad synctex call 2018-08-23 11:32:50 +01:00
Henry Oswald
607bb74ffa reduce log level 2018-08-23 11:16:28 +01:00
Henry Oswald
b4107b7391 fse.ensureDir when running synctex and wordcount 2018-08-23 08:34:18 +01:00
Henry Oswald
5074442702 fix unit tests 2018-08-23 00:21:05 +01:00
Henry Oswald
05ddbd3a18 try changing bin to be owned by node 2018-08-23 00:10:06 +01:00
Henry Oswald
7b773474d9 improve error reporting 2018-08-23 00:00:43 +01:00
Henry Oswald
e4d28addf9 change sync to async for lockfile debugging 2018-08-22 22:17:02 +01:00
Henry Oswald
171ad0329d fix sql query checking last access time 2018-08-22 18:21:15 +01:00
Henry Oswald
834eeffda4 add time to seccomp 2018-08-21 18:56:53 +01:00
Henry Oswald
0f179a7c7c add log on exited error code 2018-08-21 12:02:12 +01:00
Henry Oswald
1990f20dc0 improve error reporting 2018-08-20 10:12:32 +01:00
Henry Oswald
407c7c235b Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-08-19 11:46:11 +01:00
Henry Oswald
988f177f79 added loads of debugging 2018-08-19 11:38:27 +01:00
Christopher Hoskin
c6f49f04a9 Merge pull request #95 from sharelatex/csh-sentry
read sentry dsn from env var into config
2018-08-15 11:49:34 +01:00
Christopher Hoskin
a26d7093b4 Merge branch 'ho-dockerise' into csh-sentry 2018-08-15 09:44:02 +01:00
Henry Oswald
eec0529ef7 put FILESTORE_PARALLEL_FILE_DOWNLOADS and
FILESTORE_PARALLEL_SQL_QUERY_LIMIT into env vars
2018-08-14 15:17:56 +01:00
Christopher Hoskin
382f30f810 Revert "Put a guard on sentry dsn"
This reverts commit 95e052d059.
2018-08-13 17:36:53 +01:00
Christopher Hoskin
95e052d059 Put a guard on sentry dsn 2018-08-13 12:27:13 +01:00
Christopher Hoskin
9f79229835 Read sentry dsn from env 2018-08-03 15:33:53 +01:00
Henry Oswald
95b2e8caae comment out erroring log for moment 2018-08-01 14:32:17 +01:00
Henry Oswald
3890cdec37 null check host options 2018-08-01 14:10:22 +01:00
Henry Oswald
3e3468d9e9 reduce logging 2018-08-01 13:59:09 +01:00
Henry Oswald
9ef9a3b780 make Settings.parallelSqlQueryLimit a config setting 2018-07-31 14:38:24 +01:00
Henry Oswald
ee518c1755 fix expired projects command 2018-07-30 17:37:30 +01:00
Henry Oswald
3a9206f1e7 fix missing cb’s 2018-07-30 17:01:59 +01:00
Henry Oswald
d1ce49d6d7 add db queue file for global db query queues 2018-07-30 16:46:47 +01:00
Henry Oswald
627bed428e added a queue with 1 concurrency to db queries 2018-07-30 16:22:04 +01:00
Henry Oswald
92e1240635 added some debugging 2018-07-30 15:18:25 +01:00
Henry Oswald
94a52333f7 add sync=off and read_uncommitted=true to improve perf 2018-07-30 15:16:06 +01:00
Henry Oswald
c490479a1a remove some console.logs 2018-07-30 15:11:41 +01:00
Henry Oswald
f802717cb5 remove password from clsi for sql
sequelize fails when it is set to null
2018-07-30 14:04:33 +01:00
Henry Oswald
0eeee4284d bump retried and package versions 2018-07-30 11:25:28 +01:00
Henry Oswald
e1c23be845 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-07-26 16:52:26 +01:00
Henry Oswald
67d34fdaf0 add WAL logging 2018-07-26 16:12:26 +01:00
Christopher Hoskin
465dc31e75 Push images to overleaf-ops 2018-07-18 11:32:41 +01:00
Henry Oswald
2b6032b249 only set wal for sqlite 2018-07-17 12:53:07 +01:00
Henry Oswald
3478c28fa3 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-07-17 12:52:18 +01:00
Henry Oswald
3e26efe06f add PRAGMA journal_mode=WAL; 2018-07-17 12:50:33 +01:00
Christopher Hoskin
fb00098fc0 Bump build script to 1.1.8, drop csh-gcdm-test and csh-staging repos 2018-07-17 12:10:08 +01:00
Brian Gough
33092baf90 Merge branch 'master' of github.com:sharelatex/clsi-sharelatex 2018-07-17 10:41:14 +01:00
Brian Gough
4830e9f785 allow prune to fail to prevent build from terminating 2018-07-17 10:41:10 +01:00
Brian Gough
368f9b1c5d Merge pull request #91 from sharelatex/bg-increase-wordcount-timeout
increase timeout on wordcount
2018-07-17 10:10:36 +01:00
Henry Oswald
bcb87620b5 change override to leave image name so it works for wl_texlive 2018-07-16 17:25:14 +01:00
Henry Oswald
dd015a05cb remove express header 2018-07-16 15:38:23 +01:00
Henry Oswald
8d846f64a9 move texliveImageNameOveride further down request so it works for
compile tests
2018-07-13 11:52:49 +01:00
Henry Oswald
3545852173 quick hack to override image name further down the stack 2018-07-13 11:46:37 +01:00
Henry Oswald
7fc9412141 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-07-13 10:42:27 +01:00
Henry Oswald
a960614eb4 added texliveImageNameOveride 2018-07-13 10:37:22 +01:00
Christopher Hoskin
38bd598eb4 Merge pull request #94 from sharelatex/csh-remote-statsd
Depend on metrics v1.8.1 for remote StatsD host
2018-07-12 12:43:40 +01:00
Christopher Hoskin
97716365af Depend on metrics v1.8.1 for remote StatsD host 2018-07-12 11:22:02 +01:00
Christopher Hoskin
c1277e9f22 Use our experimental metrics 2018-07-06 15:08:38 +01:00
Henry Oswald
a75cec7d52 added maint down endpoint 2018-07-05 15:07:07 +01:00
Henry Oswald
6464aefdb4 added filestoreDomainOveride 2018-07-03 16:41:34 +01:00
Henry Oswald
ec85957ae4 add load balance http endpoints to shut box down 2018-06-28 16:04:34 +01:00
Henry Oswald
4bfc02ef3b fix seccomp key 2018-06-26 15:38:30 +01:00
Henry Oswald
364c8097c8 add error catch to settings.defaults 2018-06-26 15:04:56 +01:00
Henry Oswald
911e1d58f7 put seccomp_profile_path into variable and try catch 2018-06-26 14:44:03 +01:00
Henry Oswald
dd93d37460 added seccomp 2018-06-26 12:43:47 +01:00
Brian Gough
82b996b145 increase timeout on wordcount 2018-06-25 14:06:18 +01:00
Christopher Hoskin
b3033c1686 Add csh-staging to repos 2018-06-13 15:47:45 +01:00
Christopher Hoskin
547ef679b4 Merge pull request #89 from sharelatex/csh-issue-601
Csh issue 601
2018-06-13 15:45:17 +01:00
Henry Oswald
b30890ef99 remove the compile npm command, it isn't needed 2018-06-12 17:48:23 +01:00
Henry Oswald
926667f365 update build scripts so smoke tests are compiled 2018-06-12 17:44:13 +01:00
Christopher Hoskin
0a70985ba5 Specify repo correctly 2018-06-12 15:26:10 +01:00
Christopher Hoskin
4ca8027cb8 Increase acceptance test timeout. 2018-06-12 15:04:14 +01:00
Christopher Hoskin
da216c52e9 Accidentally left warning message commented out :( 2018-06-12 11:17:26 +01:00
Christopher Hoskin
e6532b5681 Update build scripts from 1.1.3 to 1.1.6 2018-06-12 10:22:30 +01:00
Christopher Hoskin
85aec72206 Use metadata to determine Google Cloud project dynamically. Fixes: #601 2018-06-12 10:15:17 +01:00
Henry Oswald
f000ecb681 Merge branch 'master' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-06-08 19:21:18 +01:00
Henry Oswald
436f69f3a6 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-05-25 15:33:08 +01:00
Henry Oswald
38e91ab3e4 bumped timeout to 30 seconds 2018-05-25 15:30:26 +01:00
henry oswald
0b3af7d759 change synctex binary and added it to mounted volumes in docker config 2018-05-25 13:45:07 +00:00
henry oswald
9548615169 all but the sync tests should pass 2018-05-25 12:43:12 +00:00
Henry Oswald
da814b0e3a log settings on startup 2018-05-25 12:01:16 +01:00
Henry Oswald
e544ad9a23 set user to tex for tests run on ci box 2018-05-25 11:51:34 +01:00
Henry Oswald
1814f1c997 added --exit to unit tests 2018-05-24 21:59:02 +01:00
Henry Oswald
98a4e60eb7 update to 1.1.3 build scripts 2018-05-24 19:03:57 +01:00
Alberto Fernández-Capel
039d5e01ec Merge pull request #87 from sharelatex/afc-travis-nvmrc
Make travis read the node version from the .nvmrc file
2018-05-01 09:45:55 +01:00
Alberto Fernández Capel
1d38dd3a92 Make travis read the node version from the .nvmrc file
See https://docs.travis-ci.com/user/languages/javascript-with-nodejs/#Specifying-Node.js-versions-using-.nvmrc
2018-05-01 09:25:37 +01:00
Henry Oswald
ca23cd42ad update package.json scripts 2018-04-09 11:06:35 +01:00
Henry Oswald
b330ee2d5b grep works with command
updated build scripts
acceptance tests break, files are written as root when user is node
2018-03-29 17:07:22 +01:00
Henry Oswald
b5a7eabaab update build script and add load balancer agent 2018-03-29 12:12:29 +01:00
Henry Oswald
ec75f9fa67 add smoke test env var 2018-03-20 13:48:12 +00:00
Henry Oswald
dc1ea9d3e9 amend comment 2018-03-19 14:22:18 +00:00
Henry Oswald
4d955a8d41 try a build with node user 2018-03-19 14:10:45 +00:00
Henry Oswald
0915ac8c60 run as app user and chmod 777 compiles dir 2018-03-19 12:56:53 +00:00
Henry Oswald
aeb6f48945 try running as root 2018-03-19 09:51:26 +00:00
Henry Oswald
8ccbfc7d32 don't put synctex in as a volume 2018-03-16 18:11:46 +00:00
Henry Oswald
0bd9377018 chown synctex and add the creation of directories in 2018-03-16 17:48:55 +00:00
Henry Oswald
3c1d7ab264 mkdir the /app/bin/synctex-mount 2018-03-16 17:40:10 +00:00
Henry Oswald
3d9a93ad61 add logging of docker options 2018-03-16 17:37:36 +00:00
Henry Oswald
17c51c2ba0 added debugging and new moving commands 2018-03-16 17:30:11 +00:00
Henry Oswald
f4226ecd0e try copying synctex between directories 2018-03-16 17:10:56 +00:00
Henry Oswald
6fbfcfc68b move synctex into a directory for simple mounting 2018-03-16 16:50:30 +00:00
Henry Oswald
63145cc60c add synctex back in 2018-03-16 16:22:39 +00:00
Henry Oswald
5739a2aeca comment out synctex for moment 2018-03-16 16:04:26 +00:00
Henry Oswald
9f8a68be38 add log line for connecting to a db 2018-03-16 15:29:35 +00:00
Henry Oswald
1dce40c61f make compiles dir 2018-03-16 15:25:36 +00:00
Henry Oswald
52982b8fcd remove texlive docker images 2018-03-14 15:44:58 +00:00
Henry Oswald
a741a238a8 have entrypoint kick off download of texlive images
install script exits without error if auth fails.
2018-03-14 15:44:58 +00:00
Henry Oswald
0c1b699bd5 add docker ignore rather than make clean 2018-03-14 15:44:58 +00:00
Henry Oswald
dc3cb439d0 update build scripts 2018-03-14 15:44:58 +00:00
Henry Oswald
83c7068bd1 test new scripts on ci 2018-03-14 15:44:58 +00:00
Henry Oswald
b9d94fb428 fixed commented tests 2018-03-14 15:44:58 +00:00
Henry Oswald
7dbed15fea update scripts from latest build scripts 1.1.0 2018-03-14 15:44:58 +00:00
Henry Oswald
3c4870f688 remove touch /var/run/docker.sock which doesn’t work robustly 2018-03-14 15:44:58 +00:00
Henry Oswald
4ff1121353 add cmd back in 2018-03-14 15:44:58 +00:00
Henry Oswald
aca9100c52 set entry point for dockerfile 2018-03-14 15:44:58 +00:00
Henry Oswald
96a237fb74 removed user temporarily, created make ci task 2018-03-14 15:44:58 +00:00
Henry Oswald
4e6514b17e add logging in db.coffee 2018-03-14 15:44:58 +00:00
Henry Oswald
00cf5468d0 update jenkins task 2018-03-14 15:44:58 +00:00
Henry Oswald
177c46df98 add cache dir 2018-03-14 15:44:58 +00:00
Henry Oswald
2f96350b7c removed unused scripts 2018-03-14 15:44:58 +00:00
Henry Oswald
f1df41112b wip for ci 2018-03-14 15:44:58 +00:00
Henry Oswald
b202af3cf2 added docker runner into core codebase
supports both local command runner and docker runner

added docker files for tex live

also fixed tests so they exit correctly & removed debug lines
2018-03-14 15:44:49 +00:00
Henry Oswald
3bdd50a231 fix url fetcher tests so they exit correctly 2018-03-05 10:39:46 +00:00
Henry Oswald
3134b8aada add SYNCTEX_BIN_HOST_PATH for ci 2018-03-03 13:40:29 +00:00
Henry Oswald
aa0f9ee0be Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-03-03 13:37:00 +00:00
Henry Oswald
4dd11f3442 update docker compose ci to use extension file and dockerfile 2018-03-03 13:36:42 +00:00
Henry Oswald
ae7357778e Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-03-02 18:31:09 +00:00
Henry Oswald
c6b962a8b9 Merge branch 'master' into ho-dockerise 2018-03-02 18:18:18 +00:00
Henry Oswald
3de14a3f17 Merge branch 'master' into ho-dockerise 2018-03-02 18:16:16 +00:00
Henry Oswald
49a35c5e11 Merge branch 'master' into ho-dockerise 2018-03-02 18:12:32 +00:00
Henry Oswald
b9874b5ae5 built with 1.1.0 scripts 2018-03-02 18:08:13 +00:00
Henry Oswald
5cb3bfcbbb uncomment tests 2018-03-02 17:59:37 +00:00
Henry Oswald
1a47887e80 make timeout latex more complex (slower) 2018-03-02 17:58:34 +00:00
Henry Oswald
70f016af1f unit tests pass, acceptance fail
uncomment tests
2018-03-02 17:34:41 +00:00
Henry Oswald
b8c22f4d74 wip, docker container is correctly created 2018-03-02 17:14:23 +00:00
Henry Oswald
8f6db5baff tests pass under app user 2018-03-02 17:14:23 +00:00
Henry Oswald
d698cc318f updated build scripts 2018-03-02 17:14:23 +00:00
Henry Oswald
12b13d6199 mount app as volume in docker container for local tests
change to overrides
2018-03-02 17:14:23 +00:00
Henry Oswald
a02adacc98 updated build scripts with 1.0.3 2018-03-02 17:14:23 +00:00
Henry Oswald
a2a8b70b74 acceptance tests pass inside docker container (apart from sync) 2018-03-02 17:14:23 +00:00
Henry Oswald
017ba3a4ec mvp
needs hacked patch in docker runner

wip

most tests pass
2018-03-02 17:14:20 +00:00
James Allen
b64106b730 Provide hosts and siblings container as environment settings and add npm run start script
wip acceptance tests run, but don't all pass

wip

removed npm-debug from git
2018-03-02 17:14:18 +00:00
James Allen
12c1dc632a Merge pull request #83 from sharelatex/ja-dockerize-dev
Provide hosts as settings and add npm run start script
2018-01-16 17:08:09 +00:00
James Allen
7a6294081d Allow texlive image user to be configured 2018-01-16 10:46:59 +00:00
Brian Gough
7d8a18c46c Merge pull request #82 from sharelatex/bg-log-core-files-as-error
log an error if core file is found in output
2018-01-04 09:22:44 +00:00
Brian Gough
a0d5e6a54b log an error if core file is found in output 2018-01-03 15:41:31 +00:00
James Allen
f58ef67875 Provide hosts and siblings container as environment settings and add npm run start script 2017-12-29 08:08:19 +00:00
Joe Green
6d42e18088 Add a 1 second delay to the smoke tests (#81)
* Add a 1 second delay to the smoke tests

Fixes a race condition where smoke tests exit before container can be attached to.

See here for more info: https://github.com/overleaf/sharelatex/issues/274

* give the smoke tests additional work to do

* escape slashes
2017-12-05 16:51:59 +00:00
Joe Green
ef0db41dae Merge pull request #80 from sharelatex/jg-smoketest-interval
Increase smoke test interval to 30 seconds
2017-11-29 15:34:49 +00:00
Joe Green
3692570df0 Increase smoke test interval to 30 seconds
The smoke tests can sometimes take ~20 seconds to complete, which causes the http POST to time out. This should solve that problem.
2017-11-29 11:01:51 +00:00
Brian Gough
8255997fad Merge pull request #79 from sharelatex/bg-fix-listen-in-acceptance-tests
exit if mock server fails to start
2017-10-25 09:13:33 +01:00
Brian Gough
360e8220ce exit if mock server fails to start 2017-10-20 15:16:35 +01:00
Joe Green
23f4f2175c Update Jenkinsfile 2017-10-16 14:13:51 +01:00
Joe Green
eb35cab72d only alert on master 2017-10-12 16:54:54 +01:00
Brian Gough
48b2548533 Merge pull request #78 from sharelatex/bg-fix-read-logging
fix read logging
2017-10-02 16:12:12 +01:00
Brian Gough
86cc30d8fa fix typo in log message 2017-10-02 15:45:09 +01:00
Brian Gough
60ad425205 move logging from SafeReader into caller
prevent unnecessary logging when looking at headers of files where
hitting the end of the file is expected.
2017-10-02 15:44:00 +01:00
Brian Gough
d63f339fc4 Merge pull request #77 from sharelatex/bg-fix-tikzexternalize-II
fix tikzexternalize ii
2017-10-02 11:19:06 +01:00
Brian Gough
1da918e13c simplify tikzexternalize checks 2017-09-29 17:00:53 +01:00
Brian Gough
d1aa1d84fb keep tikzexternalize files 2017-09-29 16:02:23 +01:00
Joe Green
88eafdf575 Update Jenkinsfile 2017-09-28 13:46:01 +01:00
Brian Gough
d8858cfadd Merge branch 'bg-lock-compiles' 2017-09-28 13:16:29 +01:00
Joe Green
fd0cbb2c52 use npm cache in CI build 2017-09-28 11:51:41 +01:00
Joe Green
bd5a0ef36f Jg jenkinsfile cleanup (#75)
* Update Jenkinsfile

make sure we don't ship unneeded build files

* Update ExampleDocumentTests.coffee

* use node 6.11.2 in jenkins file
2017-09-28 11:50:33 +01:00
Brian Gough
1388093866 Merge pull request #73 from sharelatex/bg-handle-dot-files-in-resource-list
handle dot files in resource list
2017-09-28 09:59:27 +01:00
Joe Green
c3e3e3d8ac Update Jenkinsfile 2017-09-26 11:44:48 +01:00
Brian Gough
23fec68111 use a separate function for hidden file check 2017-09-26 11:03:20 +01:00
Brian Gough
dbeff9a7b8 exclude hidden files from output
express static server doesn't serve them and rejects with 404
2017-09-26 10:42:59 +01:00
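A sketch of the hidden-file check this refers to (not the exact CLSI code):

    # treat a path as hidden if any component starts with a dot
    isHiddenFile = (filePath) ->
      filePath.split("/").some (part) -> part.indexOf(".") is 0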
Brian Gough
f11468b595 remove stat test for missing files 2017-09-26 09:48:09 +01:00
Brian Gough
0930b1cd8f only exclude clsi-specific files from output list 2017-09-26 09:47:29 +01:00
Brian Gough
a36ec7f54e fix comment 2017-09-25 16:06:45 +01:00
Brian Gough
eaa99c7274 fix unit tests for use of fs-extra 2017-09-25 15:28:31 +01:00
Brian Gough
b0f879d652 lock compile directory 2017-09-22 16:19:33 +01:00
Brian Gough
8305268848 unit tests for ResourceStateManager 2017-09-15 13:42:57 +01:00
Brian Gough
aa5eeb0903 fallback check for missing files
dot files are not examined by OutputFileFinder, so do an extra check to
make sure those exist

also check for any relative paths in the resources
2017-09-15 13:41:56 +01:00
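A sketch of the relative-path check described above (the real implementation may differ):

    # reject absolute paths and any path containing a ".." component
    isSafePath = (resourcePath) ->
      return false if resourcePath.indexOf("/") is 0
      return false if ".." in resourcePath.split("/")
      true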
Brian Gough
2af05030f2 Merge pull request #71 from sharelatex/bg-merge-state-and-resource-list-files
merge state and resource list files
2017-09-11 08:54:30 +01:00
Joe Green
d04f93855b Add jenkinsfile (#72)
* create Jenkinsfile

* allow texlive image to be set with env vars

* log error message in test

* use sandboxed compiles variables

* Add SANDBOXED_COMPILES_HOST_DIR var to test config

* add SIBLING_CONTAINER_USER env var
2017-09-08 14:06:04 +01:00
Brian Gough
a2c97e6f9a rename saveProjectStateHash to saveProjectState 2017-09-08 13:56:40 +01:00
Brian Gough
acab9d45a0 log any missing files 2017-09-07 16:54:09 +01:00
Brian Gough
0fac2655f7 fix whitespace 2017-09-07 13:52:34 +01:00
Brian Gough
c1ca32184f log error if state file is truncated 2017-09-07 13:52:34 +00:00
Brian Gough
97d7d76e61 combine the resource state and resource list
to prevent them getting out of sync
2017-09-07 13:52:34 +01:00
Shane Kilkelly
d865fda6a9 Merge pull request #70 from sharelatex/sk-node-6
Upgrade to node 6.11
2017-08-31 13:35:27 +01:00
Shane Kilkelly
3d053a2e34 Upgrade to node 6.9 2017-08-29 14:30:43 +01:00
Brian Gough
faa2a325cb added logging 2017-08-29 12:09:31 +01:00
James Allen
b42347ea08 Merge pull request #69 from sharelatex/as-update-docker-runner-config
Update docker-runner-sharelatex config
2017-08-24 15:17:16 +02:00
Alasdair Smith
d5b3101637 Update docker-runner-sharelatex config 2017-08-24 13:34:24 +01:00
Brian Gough
c1d1f93453 Merge pull request #66 from sharelatex/bg-compile-from-redis
Write files incrementally
2017-08-23 15:35:56 +01:00
Brian Gough
fc1782e74c read resource files safely
put a limit on the amount of data read
2017-08-18 11:17:01 +01:00
Brian Gough
6921cf25b8 splice state management into ResourceStateManager 2017-08-18 10:22:17 +01:00
Brian Gough
0b9ddb8efe fix whitespace 2017-08-18 09:41:59 +01:00
Brian Gough
e8064f12a1 finish unit test for incremental update 2017-08-18 09:41:43 +01:00
Brian Gough
e4aad90f33 ResourceWriter unit tests (wip) 2017-08-17 16:59:37 +01:00
Brian Gough
a8aaf58e64 test syncType in RequestParser 2017-08-17 15:57:05 +01:00
Brian Gough
5b5f7b0690 avoid adding draft mode more than once 2017-08-17 15:03:37 +01:00
Brian Gough
2b610030d5 store the resource list in a file 2017-08-17 14:53:35 +01:00
Brian Gough
00ddfdf42b fix unit tests 2017-08-09 15:22:44 +01:00
Brian Gough
c25e96bbc3 add comment about syncType/syncState 2017-08-09 15:22:38 +01:00
Henry Oswald
4eb8c107c9 Merge pull request #68 from sharelatex/ho-mkdir-cache-comiles
use grunt to make compiles and cache dirs
2017-08-09 11:07:36 +01:00
Brian Gough
86fa940c97 clean up the state file if no state passed in 2017-08-08 16:29:57 +01:00
Henry Oswald
7cd81ac3df use grunt to make compiles and cache dirs 2017-08-07 16:21:37 +01:00
Henry Oswald
fdc22c9cd2 Merge pull request #67 from sharelatex/revert-65-add-compiles-folder
Revert "Keep compiles and cache directories"
2017-08-07 15:29:30 +01:00
Henry Oswald
c3fe17d0b6 Revert "Keep compiles and cache directories" 2017-08-07 15:29:18 +01:00
Brian Gough
206adc2d04 fix broken unit tests 2017-08-07 15:00:16 +01:00
Brian Gough
6542ce20b6 fix incremental request 2017-08-07 14:32:28 +01:00
Brian Gough
b4be40d061 restrict syncType values to full/incremental 2017-08-07 10:19:56 +01:00
Brian Gough
11898b897e added files out of sync error object 2017-08-03 15:56:59 +01:00
Brian Gough
74c26120b2 use syncType and syncState for clsi state options 2017-08-03 12:00:32 +01:00
Brian Gough
7e1d3d98e7 write files incrementally 2017-08-02 13:46:10 +01:00
Henry Oswald
d5e0ab5a6f Merge pull request #65 from sharelatex/add-compiles-folder
Keep compiles and cache directories
2017-07-28 11:24:36 +01:00
Hayden Faulds
4c105e7826 keep cache directory 2017-07-27 15:54:20 +01:00
Hayden Faulds
cd5adaff51 keep compiles directory 2017-07-27 14:02:24 +01:00
Henry Oswald
e5081df2a9 Revert "change"
This reverts commit 104ce81ebd.
2017-07-23 22:45:04 +01:00
Henry Oswald
104ce81ebd change 2017-07-23 22:42:07 +01:00
Brian Gough
08fd440df5 Merge pull request #63 from sharelatex/bg-fix-tikzmanager-exception
fix tikzmanager exception
2017-07-20 13:22:58 +01:00
Brian Gough
11cd569ed9 stub out unwanted dependency in unit tests 2017-07-18 11:30:22 +01:00
Brian Gough
472531f617 fix exception for empty content in TikzManager 2017-07-18 11:29:59 +01:00
Brian Gough
ea34a1a89d update acceptance test images for texlive 2017 2017-07-13 13:15:51 +01:00
Brian Gough
2e91f07014 update acceptance tests settings to 2017 image 2017-07-12 16:59:33 +01:00
Shane Kilkelly
6f322583f7 Merge branch 'sk-reduce-kill-project-errors' 2017-06-27 10:03:51 +01:00
Shane Kilkelly
a74f4ac1a6 Send a 404 if the project files have gone away when running synctex.
This is semantically nicer than the 500 response which used to be
produced in these circumstances.
2017-06-23 14:46:40 +01:00
Shane Kilkelly
aa1dd2bf05 Killing an already stopped project is not an error
Log a warning instead and continue.
2017-06-20 09:18:15 +01:00
Shane Kilkelly
8e2584bab4 Mock out logger in tests 2017-06-20 08:25:50 +01:00
Brian Gough
f8530da626 Merge pull request #60 from sharelatex/bg-delete-xdv-files
delete intermediate xdv files from xelatex
2017-06-16 09:13:43 +01:00
Brian Gough
2edc015663 delete intermediate xdv files from xelatex 2017-06-15 15:37:45 +01:00
Brian Gough
f94e9989ec Merge pull request #58 from sharelatex/bg-check-dir-before-synctex
check file exists before running synctex
2017-05-31 10:16:06 +01:00
Brian Gough
c62f8b4854 check directory exists and bail out on error 2017-05-31 10:06:27 +01:00
Brian Gough
2d389130cc Merge pull request #59 from sharelatex/bg-reduce-clsi-error-reporting
don't report compile timeouts to sentry
2017-05-30 15:39:04 +01:00
Brian Gough
aafa691119 check file exists before running synctex 2017-05-24 10:09:43 +01:00
Brian Gough
a98b2b8032 don't report compile timeouts to sentry
just log them instead
2017-05-24 09:42:05 +01:00
Brian Gough
398ba5ae34 Merge pull request #56 from sharelatex/bg-disable-qpdf-setting
add setting to avoid optimisations outside docker
2017-04-11 14:16:19 +01:00
Brian Gough
a1613eac5a add setting to avoid optimisations outside docker 2017-04-10 16:12:03 +01:00
Brian Gough
3526fde665 Merge pull request #55 from sharelatex/bg-check-pdf-output-is-optimised
use pdfinfo on output to ensure pdfs are optimised
2017-04-10 15:06:22 +01:00
Brian Gough
e1b44beb3f use pdfinfo on output to ensure pdfs are optimised
needed to check that qpdf runs correctly inside the docker container
2017-04-07 11:11:27 +01:00
Brian Gough
17b16dadcd Merge pull request #54 from sharelatex/bg-avoid-running-qpdf-on-already-optimised-files
check if file is optimised before running qpdf
2017-04-05 13:18:32 +01:00
Brian Gough
eb1364f249 check if file is optimised before running qpdf 2017-04-04 16:50:06 +01:00
Shane Kilkelly
834ad57312 Add a .nvmrc file 2017-03-27 14:47:48 +01:00
Brian Gough
19dfaa7d55 Merge pull request #53 from sharelatex/bg-sanitise-paths
additional check for valid rootResource
2017-03-21 13:39:27 +00:00
Brian Gough
b529b8add3 Merge pull request #52 from sharelatex/bg-tikz-externalize
support for tikz externalize
2017-03-21 13:39:14 +00:00
Brian Gough
7ccc9500ed check for \tikzexternalize directly
instead of \usepackage{tikz} and \usepackage{pgf}
2017-03-21 11:36:08 +00:00
Brian Gough
750576d1b0 fix path match 2017-03-21 11:30:32 +00:00
Brian Gough
021d848819 create separate function for path checking 2017-03-21 11:29:37 +00:00
Brian Gough
8803762081 support for tikz externalize
make copy of main file as output.tex for tikz externalize
2017-03-20 10:55:28 +00:00
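A sketch of the check after this change (the real TikzManager may differ): the main file is copied to output.tex only when it actually calls \tikzexternalize.

    needsOutputFile = (content) ->
      content? and content.indexOf("\\tikzexternalize") > -1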
Brian Gough
5af137f60b additional check for valid rootResource 2017-03-20 10:03:48 +00:00
Brian Gough
f059948e27 update xelatex acceptance test pdf 2017-03-08 11:49:21 +00:00
Brian Gough
7a7c2ee992 improve debugging of failed acceptance tests
use the example name in the output filename
2017-03-08 11:49:12 +00:00
Brian Gough
efe5e22b4c include otf extension in fontawesome test 2017-03-08 11:25:25 +00:00
Shane Kilkelly
03d1936fde Upgrade logger 2017-03-06 14:56:32 +00:00
Shane Kilkelly
a0969ec839 Don't compile acceptance test files during test run 2017-03-06 14:43:14 +00:00
Brian Gough
fdab7763a2 Merge pull request #51 from sharelatex/bg-fix-latexmk-args
allow latexmk to pass through options
2017-03-03 13:19:23 +00:00
Brian Gough
57a5cfa9cb allow latexmk to pass through options
this avoids problems in the latest version of latexmk where the
$pdflatex variable has been replaced by $xelatex and $lualatex when
running with -xelatex or -lualatex
2017-03-02 16:43:35 +00:00
Joe Green
bfb27e6c25 Merge pull request #50 from sharelatex/ho-remove-tcp
remove tcp code, moved to agent load balancer
2017-02-23 14:42:54 +00:00
Henry Oswald
d4d3048719 remove tcp code, moved to agent load balancer 2017-02-23 11:09:18 +00:00
Brian Gough
29594fd0f7 fix acceptance test config file for latex prefix
latex command prefix was in wrong scope
2017-02-21 09:37:05 +00:00
Brian Gough
a50582fd7c add fontawesome acceptance test for xelatex 2017-02-21 09:37:05 +00:00
Henry Oswald
08f0955817 Merge pull request #49 from sharelatex/ho-one-cpu-size
if host has 1 cpu (staging) then set availableWorkingCpus to 1
2017-02-20 15:20:04 +00:00
Henry Oswald
bc1b8f4b2f Update app.coffee 2017-02-20 15:19:04 +00:00
Henry Oswald
599977c3e0 if host has 1 cpu (staging) then set availableWorkingCpus to 1 2017-02-20 15:16:52 +00:00
Brian Gough
071b2269b3 update acceptance tests for reversion to dvipdf 2017-02-13 13:42:44 +00:00
Brian Gough
fde8149579 fix #! in test script 2017-02-09 15:38:25 +00:00
Brian Gough
6b7e33bbc6 show debug info for acceptance tests 2017-02-09 14:17:38 +00:00
Brian Gough
2898a82de8 update acceptance test output for fontawesome 2017-02-07 11:51:21 +00:00
Brian Gough
5b71b849ca added fontawesome acceptance test 2017-02-07 10:00:41 +00:00
Brian Gough
6cb5926c21 fix lualatex require 2017-02-07 08:59:45 +00:00
Brian Gough
3cffb61c74 add luatex85 package to tikz feynman test 2017-02-07 08:49:19 +00:00
Brian Gough
5705455ce1 added acceptance test for tikz-feynman 2017-02-07 08:12:47 +00:00
Brian Gough
71fb15e0ee update knitr_utf acceptance test output
needs to include table of contents from multiple latexmk runs
2017-02-06 16:27:47 +00:00
Brian Gough
819c642b8d add knitr utf8 acceptance test 2017-02-03 15:38:06 +00:00
Brian Gough
20cb52793d add acceptance test for hebrew 2017-02-03 15:16:47 +00:00
Brian Gough
e507bd6394 update acceptance test image for lualatex
small pixel-level change in output
2017-01-31 16:04:59 +00:00
Brian Gough
444b3586a7 increase debugging in acceptance tests 2017-01-31 10:47:49 +00:00
Brian Gough
e25ebd296e add debugging to acceptance tests 2017-01-31 10:40:05 +00:00
Brian Gough
5090ad5c41 update feymp test image
minor pixel change in position of labels in texlive 2016
2017-01-31 10:21:00 +00:00
Brian Gough
bc73f719b2 update asymptote pdf to a4 size for texlive 2016 2017-01-31 09:53:36 +00:00
Brian Gough
d238f73e29 try output.pdf generated with texlive 2016 2017-01-30 15:37:26 +00:00
Brian Gough
ea484da9f4 update latex_compiler test pdf 2017-01-27 12:32:14 +00:00
Brian Gough
b76a81e98b specify papersize explicitly in latex test 2017-01-27 12:21:57 +00:00
Brian Gough
f00be9018d log acceptance test server output to file 2017-01-26 12:20:41 +00:00
Brian Gough
146138f65c try running user as jenkins 2017-01-26 12:06:38 +00:00
Brian Gough
654a43655f update image for docker tests 2017-01-25 14:12:19 +00:00
Brian Gough
b9d6db6caf use local docker image for clsi test 2017-01-25 14:09:44 +00:00
Brian Gough
03e837c1f4 run tests outside container, add settings file 2017-01-25 14:08:39 +00:00
Brian Gough
420db18a03 upgrade to latest sqlite3 2017-01-24 16:06:32 +00:00
Brian Gough
dab92967c8 added docker script for acceptance tests 2017-01-24 12:18:30 +00:00
Brian Gough
0530e21246 fix acceptance tests 2017-01-24 11:07:54 +00:00
Brian Gough
9e53c0b99e fix exception in error log 2016-10-14 10:23:13 +01:00
Shane Kilkelly
61089eca40 Increase memory limit to 64mb 2016-09-28 11:02:58 +01:00
Shane Kilkelly
4827aec30b Add test for new ulimit options 2016-09-23 15:34:29 +01:00
Shane Kilkelly
0900340282 Add CHKTEX_ULIMIT_OPTIONS 2016-09-23 15:32:37 +01:00
James Allen
f7b4883397 Don't delete knitr cache files 2016-09-22 14:14:29 +01:00
James Allen
79b3d2172b Sanitize resource path along with rootResourcePath 2016-09-21 15:09:01 +01:00
Brian Gough
9f49dc8554 Merge pull request #45 from sharelatex/fix-chktex-for-knitr
only run chktex on .tex files, not .Rtex files
2016-09-12 16:36:59 +01:00
Brian Gough
ee170b4e67 only run chktex on .tex files, not .Rtex files
the .tex files produced from knitr have macros which confuse chktex
2016-09-12 16:29:36 +01:00
Shane Kilkelly
47105190be Revert "Revert "Revert "Upgrade to node 4.2"""
This reverts commit 98fb2cab99.
2016-09-01 12:47:13 +01:00
Shane Kilkelly
98fb2cab99 Revert "Revert "Upgrade to node 4.2""
This reverts commit 4128dc6fdd.
2016-09-01 11:22:11 +01:00
Shane Kilkelly
4128dc6fdd Revert "Upgrade to node 4.2"
This reverts commit 8bb12f4d99.
2016-09-01 09:53:12 +01:00
Shane Kilkelly
4a2b2a8707 Merge branch 'master' into sk-node-upgrade 2016-08-31 16:34:25 +01:00
Brian Gough
095e16e953 handle failed compile due to validation error 2016-08-24 15:46:47 +01:00
Brian Gough
3a73971b42 fix commandRunner error to match dockerRunner 2016-08-24 15:45:26 +01:00
Brian Gough
748caeee7d remove chktex error
too many false positives from 'unable to execute latex command'
2016-08-22 15:11:39 +01:00
Brian Gough
cd7ed6ce66 update tests 2016-08-11 10:31:37 +01:00
Brian Gough
2200ac2cf2 capture texcount error output 2016-08-11 10:26:08 +01:00
Brian Gough
928ffc96e6 read wordcount output asynchronously 2016-08-11 09:32:53 +01:00
Brian Gough
ade3da7e0d add missing argument parameter to wordcount call 2016-08-11 09:29:03 +01:00
Brian Gough
e66b1ecdea use a command wrapper for synctex
instead of an alternative child_process object
2016-08-04 16:08:14 +01:00
Brian Gough
c6744caeeb change logging message to be different from LatexRunner 2016-08-04 16:07:36 +01:00
Brian Gough
189648e39a Merge pull request #44 from sharelatex/add-chktex-support
Add chktex support
2016-08-02 14:55:38 +01:00
Brian Gough
8da29e6024 provide setting to override child_process.execFile for synctex 2016-07-29 14:54:24 +01:00
Brian Gough
664e908378 provide validation mode where compilation always exits after chktex 2016-07-27 16:54:27 +01:00
Brian Gough
14837a57ec run chktex when request has check:true 2016-07-26 16:22:38 +01:00
Brian Gough
6524439699 add support for passing additional environment parameters to command runner
includes an example of passing environment variables to chktex
2016-07-26 12:30:29 +01:00
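For illustration, the extra parameter is an object of environment variables handed to the command runner, along these lines (the variable name here is invented for the example):

    environment =
      CHKTEX_EXIT_ON_ERROR: "1"  # invented name, for illustration only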
Brian Gough
a7c7f2697f Merge pull request #43 from sharelatex/stop-compile
add support for stopping compile
2016-07-18 11:16:53 +01:00
Brian Gough
fdf274fb82 remove dead code 2016-07-18 11:05:45 +01:00
Brian Gough
69666bef60 add support for stopping compile 2016-07-14 16:43:52 +01:00
Henry Oswald
cd8e60195c Merge pull request #42 from WaeCo/patch-1
Set default project_cache_length_ms to 1 day
2016-07-13 21:32:02 +01:00
WaeCo
d6808c11cc Set default project_cache_length_ms to 1 day
`project_cache_length_ms` was only `60*60*24` ms (roughly 1.5 minutes), which is a little bit short. A default of one day seems more reasonable.
2016-07-13 13:26:32 -07:00
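The arithmetic behind the fix: as a millisecond value, 60*60*24 = 86400 ms is under two minutes, while one day is 1000*60*60*24 = 86400000 ms. As a settings entry (sketch):

    project_cache_length_ms: 1000 * 60 * 60 * 24  # one day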
Brian Gough
133f522e7b Merge pull request #41 from sharelatex/per-user-containers-part-3
Reduce number of cached builds for per-user containers
2016-06-30 08:06:01 +01:00
Brian Gough
d29416fc77 keep one extra build until per-page pdf serving is enabled 2016-06-29 16:31:16 +01:00
Brian Gough
c486d6c215 only keep a single cached output directory in per-user containers 2016-06-28 09:28:40 +01:00
Shane Kilkelly
8bb12f4d99 Upgrade to node 4.2 2016-06-20 09:31:30 +01:00
Shane Kilkelly
e4ffc94de8 Move the latexmk timing command into a configurable latexmkCommandPrefix.
By default, no timing information will be taken.
On Linux with GNU user land, this value should be configured to `["/usr/bin/time", "-v"]`.
On Mac, gnu-time should be installed and configured to `["/usr/local/bin/gtime", "-v"]`.
2016-06-17 14:38:08 +01:00
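A sketch of what that could look like in the settings file (CoffeeScript; where exactly the key sits in the CLSI config is assumed here, and the values are the ones named in the commit message):

    # no timing information by default
    latexmkCommandPrefix: []
    # on Linux with GNU userland:
    # latexmkCommandPrefix: ["/usr/bin/time", "-v"]
    # on Mac with gnu-time installed:
    # latexmkCommandPrefix: ["/usr/local/bin/gtime", "-v"]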
Brian Gough
0b8435e358 add route to serve files from top level of per user containers 2016-06-15 16:12:19 +01:00
Brian Gough
801f09e7ed Merge branch 'per-user-containers-part-2'
Conflicts:
	app/coffee/CompileController.coffee
2016-06-13 09:33:41 +01:00
Brian Gough
603b3d617c Merge pull request #39 from sharelatex/per-user-containers-part-1
Per user containers part 1
2016-06-09 15:17:35 +01:00
Henry Oswald
b97627d6d8 use process id to link process to smoke test 2016-06-07 14:47:51 +01:00
Henry Oswald
da02661d53 add random string to smoke tests to avoid collision 2016-06-07 14:39:01 +01:00
Brian Gough
6e017ecaf1 log user_id when clearing project 2016-06-02 15:32:33 +01:00
Brian Gough
0887fe3a72 add per-user routes for clearing cache and extend expiry methods
this adds separate functionality for clearing the cache (assets and
database) and the project compile directory for a specific user
2016-06-02 15:32:33 +01:00
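Hypothetical route shapes for the per-user variants (a sketch; the exact paths and controller methods are not shown in this log):

    app.post   "/project/:project_id/user/:user_id/compile", CompileController.compile
    app.delete "/project/:project_id/user/:user_id",         CompileController.clearCache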
Brian Gough
226e6c87b1 add per-user routes and methods 2016-06-02 15:32:31 +01:00
Brian Gough
8c42a353e1 put the build id in the output file urls
the url attribute will now give the preferred location for accessing
the output file, without the url having to be constructed by the web
client
2016-06-02 15:30:50 +01:00
Brian Gough
78b88683fc put the build id in the output file urls
the url attribute will now give the preferred location for accessing
the output file, without the url having to be constructed by the web
client
2016-06-02 15:29:56 +01:00
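Illustratively, an entry in the output file list might then look like this (values made up; the url attribute is the point of the change):

    {
      "path": "output.pdf",
      "type": "pdf",
      "build": "a1b2c3d4e5f6",
      "url": "/project/<project_id>/build/a1b2c3d4e5f6/output/output.pdf"
    }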
Henry Oswald
ac3b7a571a log out error on synctex 2016-05-27 16:18:18 +01:00
Henry Oswald
cda1e301f6 log out errors more clearly 2016-05-27 14:45:39 +01:00
Henry Oswald
da324a8dd0 added logger.info to test setup 2016-05-24 14:12:02 +01:00
Henry Oswald
b2f687c061 log out which command logger is used 2016-05-24 14:08:39 +01:00
Henry Oswald
2c3b1126b0 log out if the command runner is being used 2016-05-23 15:45:39 +01:00
Henry Oswald
22f730c3e9 parallelFileDownloads defaults to 1, sql can't take it 2016-05-23 14:31:27 +01:00
Henry Oswald
2e97bcba3a add error handler on CommandRunner 2016-05-23 14:13:55 +01:00
Brian Gough
0da85d5d03 be ready to serve files from per-user containers 2016-05-20 10:23:07 +01:00
Brian Gough
3379577499 fix error in log for expiry timeout 2016-05-20 10:23:07 +01:00
Henry Oswald
855169b571 Merge branch 'master' of https://github.com/sharelatex/clsi-sharelatex 2016-05-19 16:57:19 +01:00
Henry Oswald
6b107bd20a log out EXPIRY_TIMEOUT 2016-05-19 16:57:14 +01:00
Henry Oswald
a2c2fc3a51 make cached assets ttl set via config 2016-05-19 16:51:50 +01:00
Brian Gough
f8ae215c1e avoid clobbering the existing port variable 2016-05-19 16:38:18 +01:00
Brian Gough
d26c6b933e return the file path in the output file list for easy lookup 2016-05-19 16:38:18 +01:00
Brian Gough
4496ddddfd Merge pull request #38 from sharelatex/add-fast-path-to-pdf
Add fast path to pdf
2016-05-13 12:32:26 +01:00
Brian Gough
434e00cb74 make the build id a secure random token
we allow existing build ids to work for backwards compatibility
this can be removed after some time
2016-05-13 10:11:35 +01:00
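A minimal sketch of generating such a token in Node (the length and encoding CLSI actually uses may differ):

    crypto = require "crypto"
    buildId = crypto.randomBytes(16).toString("hex")  # e.g. "9f86d081884c7d65..."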
Brian Gough
f92c70935b allow direct path to output file /project/project_id/build/build_id/output/*
this avoids use of the query string ?build=... and so we can match the
url directly with the nginx location directive
2016-05-13 10:10:48 +01:00
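As a sketch, this is the kind of nginx location block the direct path enables (the on-disk layout shown here is hypothetical):

    location ~ ^/project/(?<project>[^/]+)/build/(?<build>[^/]+)/output/(?<file>.+)$ {
        alias /var/clsi/output/$project-$build/$file;
    }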
Brian Gough
51f87c5f79 fix logic excluding smoke test in metric 2016-05-10 10:10:01 +01:00
Brian Gough
143913c67f fix tagname for graphite 2016-05-10 09:41:39 +01:00
Brian Gough
dfd2bc31ef record system time 2016-05-10 09:12:13 +01:00
Brian Gough
e70bd3ae8e preserve existing metric name 2016-05-10 09:12:00 +01:00
Brian Gough
0a5ca6b0fa add timing information from /usr/bin/time 2016-05-09 16:00:24 +01:00
Brian Gough
834668b033 add a metric for the TeXLive image used on each compile 2016-05-09 15:36:11 +01:00
Henry Oswald
35240fbd4d move back to 2.5 days cache for moment 2016-04-21 17:40:09 +01:00
Henry Oswald
5f7cd5ece5 added project status endpoint
used for getting the server a project is on
2016-04-20 15:38:05 +01:00
Henry Oswald
6860d2be6c increased clsi cache to 3.5 days 2016-04-13 09:29:57 +01:00
Henry Oswald
3c021fd4c9 ignore ECONNRESET 2016-04-12 13:32:58 +01:00
Henry Oswald
f453f954e4 use socket.end for tcp checks 2016-04-12 10:49:45 +01:00
Henry Oswald
cd499fa4e5 server load endpoint uses settings for port 2016-04-11 13:47:06 +01:00
Henry Oswald
7799e0bfdd return 0 for server which is being hammered
socket.destroy when finished
2016-04-08 15:40:02 +01:00
Henry Oswald
6ca8c10734 added err handler to socket 2016-04-08 15:25:00 +01:00
Henry Oswald
84cba7365f work off 1 min load and set server as up 2016-04-08 15:18:22 +01:00
Henry Oswald
11be12fc8e evaluate on every call 2016-04-08 14:14:05 +01:00
Henry Oswald
3e70c0f8e4 added example server load tcp server 2016-04-08 13:31:23 +01:00
Brian Gough
558e9ae22b don't log errors when files have disappeared from build directory 2016-04-07 16:16:39 +01:00
Brian Gough
83e373d7e1 log errors in detail when file cannot be removed 2016-04-04 16:22:48 +01:00
Brian Gough
24fc9391c3 upgrade to the latest version of request 2016-03-31 14:03:48 +01:00
Brian Gough
7ff56c4793 suppress error when removing nonexistent file from cache 2016-03-31 13:33:42 +01:00
Brian Gough
665dbff75a parameter check on project_id 2016-03-31 12:12:25 +01:00
Brian Gough
5d6fb4579a remove console.log 2016-03-31 11:59:17 +01:00
Brian Gough
bd036534e5 check directory exists before attempting to clear it 2016-03-31 11:59:17 +01:00
Brian Gough
3dcd4af62e always create project directory when syncing resources to disk
avoids errors when project is empty
2016-03-31 11:59:17 +01:00
Brian Gough
fe46a96fd2 don't log missing files as warnings, but do report file access errors 2016-03-31 11:14:39 +01:00
Brian Gough
8fcbec5c0f add support for sentry 2016-03-30 14:35:47 +01:00
James Allen
fbb00ebf2f Only archive main log and blg 2016-03-30 14:10:07 +01:00
James Allen
6117cac1fd Ignore both .cache and .archive and other hidden files in finding output files 2016-03-30 11:41:11 +01:00
James Allen
d949d4ac32 Don't timestamp strace logs otherwise it runs as a new container each time since the command changes 2016-03-30 10:59:01 +01:00
James Allen
6af22cf184 Add in flags to run strace and capture logs 2016-03-30 10:37:22 +01:00
Brian Gough
9f104a4f57 bugfix - avoid double counting compiles 2016-03-17 14:37:34 +00:00
Brian Gough
595bfe09ac add metric for qpdf 2016-03-17 09:55:18 +00:00
Brian Gough
e64b08fcbe add metrics for latexmk runs and errors 2016-03-17 09:55:18 +00:00
Henry Oswald
dcfe1118d4 increased EXPIRY_TIMEOUT from 1.5 days to 2.5 days 2016-03-10 10:30:37 +00:00
James Allen
89acd36dde Send .svg files as text/plain to prevent executable JS if they are loaded as SVG in the browser 2016-03-10 09:32:32 +00:00
James Allen
a3383f11a1 Make draft mode regex global 2016-02-02 15:28:59 +00:00
James Allen
2df886e330 Remove left over debug log line 2016-02-02 14:28:51 +00:00
James Allen
d96605d5e8 Inject [draft] option to documentclass if draft option is passed 2016-02-02 14:26:14 +00:00
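A rough sketch of the injection (not the actual CLSI regex; the global flag is what the follow-up commit above adds):

    injectDraftOption = (content) ->
      content.replace /\\documentclass\[/g, "\\documentclass[draft,"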
James Allen
03b75b12cf Download up to 5 files in parallel 2016-02-01 13:19:16 +00:00
James Allen
86cf05c732 Support configurable images in wordcount end point 2016-01-19 14:12:41 +00:00
James Allen
4497352a3a Allow optional image name to be passed 2016-01-15 09:59:06 +00:00
Henry Oswald
601a3e4805 Merge branch 'master' of https://github.com/sharelatex/clsi-sharelatex 2015-12-15 19:34:34 +00:00
Henry Oswald
0ea28710f5 fixed missing value in logger 2015-12-15 19:33:37 +00:00
James Allen
2b5e7be964 Remove undefined reference to dst 2015-12-03 14:54:48 +00:00
Henry Oswald
c178458223 added try catch around word count where a file is not created 2015-11-12 15:19:22 +00:00
Henry Oswald
3ed29b3489 increased cache time to 1.5 days 2015-10-21 10:02:30 +01:00
Shane Kilkelly
29be2dc700 When serving output files, intelligently determine the appropriate content-type.
cherry pick 6fa3fda3ed28239cf3ac9720629f9707663aa197 from datajoy.
2015-09-21 16:59:35 +01:00
103 changed files with 8306 additions and 551 deletions

.dockerignore (new file, +9 lines)

@@ -0,0 +1,9 @@
node_modules/*
gitrev
.git
.gitignore
.npm
.nvmrc
nodemon.json
app.js
**/js/*

.github/ISSUE_TEMPLATE.md (new file, +38 lines)

@@ -0,0 +1,38 @@
<!-- BUG REPORT TEMPLATE -->
## Steps to Reproduce
<!-- Describe the steps leading up to when / where you found the bug. -->
<!-- Screenshots may be helpful here. -->
1.
2.
3.
## Expected Behaviour
<!-- What should have happened when you completed the steps above? -->
## Observed Behaviour
<!-- What actually happened when you completed the steps above? -->
<!-- Screenshots may be helpful here. -->
## Context
<!-- How has this issue affected you? What were you trying to accomplish? -->
## Technical Info
<!-- Provide any technical details that may be applicable (or N/A if not applicable). -->
* URL:
* Browser Name and version:
* Operating System and version (desktop or mobile):
* Signed in as:
* Project and/or file:
## Analysis
<!--- Optionally, document investigation of / suggest a fix for the bug, e.g. 'comes from this line / commit' -->
## Who Needs to Know?
<!-- If you want to bring this to the attention of particular people, @-mention them below. -->
<!-- If a user reported this bug and should be notified when it is fixed, provide the Front conversation link. -->
-
-

.github/PULL_REQUEST_TEMPLATE.md (new file, +45 lines)

@@ -0,0 +1,45 @@
<!-- Please review https://github.com/overleaf/write_latex/blob/master/.github/CONTRIBUTING.md for guidance on what is expected in each section. -->
### Description
#### Screenshots
#### Related Issues / PRs
### Review
#### Potential Impact
#### Manual Testing Performed
- [ ]
- [ ]
#### Accessibility
### Deployment
#### Deployment Checklist
- [ ] Update documentation not included in the PR (if any)
- [ ]
#### Metrics and Monitoring
#### Who Needs to Know?

.gitignore (5 changed lines)

@@ -7,10 +7,13 @@ test/acceptance/js
test/acceptance/fixtures/tmp
compiles
app.js
**/*.map
.DS_Store
*~
cache
.vagrant
db.sqlite
db.sqlite-wal
db.sqlite-shm
config/*
bin/synctex
npm-debug.log

.nvmrc (new file, +1 line)

@@ -0,0 +1 @@
10.15.0

.travis.yml (modified)

@@ -1,8 +1,5 @@
language: node_js
node_js:
- "0.10"
before_install:
- npm install -g grunt-cli

.viminfo (new file, +35 lines)

@@ -0,0 +1,35 @@
# This viminfo file was generated by Vim 7.4.
# You may edit it if you're careful!
# Value of 'encoding' when this file was written
*encoding=latin1
# hlsearch on (H) or off (h):
~h
# Command Line History (newest to oldest):
:x
# Search String History (newest to oldest):
# Expression History (newest to oldest):
# Input Line History (newest to oldest):
# Input Line History (newest to oldest):
# Registers:
# File marks:
'0 1 0 ~/hello
# Jumplist (newest first):
-' 1 0 ~/hello
# History of marks within files (newest to oldest):
> ~/hello
" 1 0
^ 1 1
. 1 0
+ 1 0

Dockerfile (new file, +27 lines)

@@ -0,0 +1,27 @@
FROM node:10.15.0 as app
WORKDIR /app
#wildcard as some files may not be in all repos
COPY package*.json npm-shrink*.json /app/
RUN npm install --quiet
COPY . /app
RUN npm run compile:all
FROM node:10.15.0
RUN \
apt -y update && \
apt -y install moreutils
COPY --from=app /app /app
WORKDIR /app
RUN chmod 0755 ./install_deps.sh && ./install_deps.sh
ENTRYPOINT ["/bin/bash", "entrypoint.sh"]
CMD ["node", "--expose-gc", "app.js"]

Gruntfile.coffee (deleted)

@@ -1,98 +0,0 @@
spawn = require("child_process").spawn
module.exports = (grunt) ->
grunt.initConfig
coffee:
app_src:
expand: true,
flatten: true,
cwd: "app"
src: ['coffee/*.coffee'],
dest: 'app/js/',
ext: '.js'
app:
src: "app.coffee"
dest: "app.js"
unit_tests:
expand: true
cwd: "test/unit/coffee"
src: ["**/*.coffee"]
dest: "test/unit/js/"
ext: ".js"
acceptance_tests:
expand: true
cwd: "test/acceptance/coffee"
src: ["**/*.coffee"]
dest: "test/acceptance/js/"
ext: ".js"
smoke_tests:
expand: true
cwd: "test/smoke/coffee"
src: ["**/*.coffee"]
dest: "test/smoke/js"
ext: ".js"
clean:
app: ["app/js/"]
unit_tests: ["test/unit/js"]
acceptance_tests: ["test/acceptance/js"]
smoke_tests: ["test/smoke/js"]
execute:
app:
src: "app.js"
mochaTest:
unit:
options:
reporter: "spec"
grep: grunt.option("grep")
src: ["test/unit/js/**/*.js"]
acceptance:
options:
reporter: "spec"
timeout: 40000
grep: grunt.option("grep")
src: ["test/acceptance/js/**/*.js"]
smoke:
options:
reporter: "spec"
timeout: 10000
src: ["test/smoke/js/**/*.js"]
grunt.loadNpmTasks 'grunt-contrib-coffee'
grunt.loadNpmTasks 'grunt-contrib-clean'
grunt.loadNpmTasks 'grunt-mocha-test'
grunt.loadNpmTasks 'grunt-shell'
grunt.loadNpmTasks 'grunt-execute'
grunt.loadNpmTasks 'grunt-bunyan'
grunt.registerTask 'compile:bin', () ->
callback = @async()
proc = spawn "cc", [
"-o", "bin/synctex", "-Isrc/synctex",
"src/synctex.c", "src/synctex/synctex_parser.c", "src/synctex/synctex_parser_utils.c", "-lz"
], stdio: "inherit"
proc.on "close", callback
grunt.registerTask 'compile:app', ['clean:app', 'coffee:app', 'coffee:app_src', 'coffee:smoke_tests', 'compile:bin']
grunt.registerTask 'run', ['compile:app', 'bunyan', 'execute']
grunt.registerTask 'compile:unit_tests', ['clean:unit_tests', 'coffee:unit_tests']
grunt.registerTask 'test:unit', ['compile:app', 'compile:unit_tests', 'mochaTest:unit']
grunt.registerTask 'compile:acceptance_tests', ['clean:acceptance_tests', 'coffee:acceptance_tests']
grunt.registerTask 'test:acceptance', ['compile:acceptance_tests', 'mochaTest:acceptance']
grunt.registerTask 'compile:smoke_tests', ['clean:smoke_tests', 'coffee:smoke_tests']
grunt.registerTask 'test:smoke', ['compile:smoke_tests', 'mochaTest:smoke']
grunt.registerTask 'install', 'compile:app'
grunt.registerTask 'default', ['run']

123
Jenkinsfile vendored Normal file

@@ -0,0 +1,123 @@
String cron_string = BRANCH_NAME == "master" ? "@daily" : ""
pipeline {
agent any
environment {
GIT_PROJECT = "clsi"
JENKINS_WORKFLOW = "clsi-sharelatex"
TARGET_URL = "${env.JENKINS_URL}blue/organizations/jenkins/${JENKINS_WORKFLOW}/detail/$BRANCH_NAME/$BUILD_NUMBER/pipeline"
GIT_API_URL = "https://api.github.com/repos/overleaf/${GIT_PROJECT}/statuses/$GIT_COMMIT"
}
triggers {
pollSCM('* * * * *')
cron(cron_string)
}
stages {
stage('Install') {
steps {
withCredentials([usernamePassword(credentialsId: 'GITHUB_INTEGRATION', usernameVariable: 'GH_AUTH_USERNAME', passwordVariable: 'GH_AUTH_PASSWORD')]) {
sh "curl $GIT_API_URL \
--data '{ \
\"state\" : \"pending\", \
\"target_url\": \"$TARGET_URL\", \
\"description\": \"Your build is underway\", \
\"context\": \"ci/jenkins\" }' \
-u $GH_AUTH_USERNAME:$GH_AUTH_PASSWORD"
}
}
}
stage('Build') {
steps {
sh 'make build'
}
}
stage('Unit Tests') {
steps {
sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_unit'
}
}
stage('Acceptance Tests') {
steps {
sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_acceptance'
}
}
stage('Package and docker push') {
steps {
sh 'echo ${BUILD_NUMBER} > build_number.txt'
sh 'touch build.tar.gz' // Avoid tar warning about files changing during read
sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make tar'
withCredentials([file(credentialsId: 'gcr.io_overleaf-ops', variable: 'DOCKER_REPO_KEY_PATH')]) {
sh 'docker login -u _json_key --password-stdin https://gcr.io/overleaf-ops < ${DOCKER_REPO_KEY_PATH}'
}
sh 'DOCKER_REPO=gcr.io/overleaf-ops make publish'
sh 'docker logout https://gcr.io/overleaf-ops'
}
}
stage('Publish to s3') {
steps {
sh 'echo ${BRANCH_NAME}-${BUILD_NUMBER} > build_number.txt'
withAWS(credentials:'S3_CI_BUILDS_AWS_KEYS', region:"${S3_REGION_BUILD_ARTEFACTS}") {
s3Upload(file:'build.tar.gz', bucket:"${S3_BUCKET_BUILD_ARTEFACTS}", path:"${JOB_NAME}/${BUILD_NUMBER}.tar.gz")
}
withAWS(credentials:'S3_CI_BUILDS_AWS_KEYS', region:"${S3_REGION_BUILD_ARTEFACTS}") {
// The deployment process uses this file to figure out the latest build
s3Upload(file:'build_number.txt', bucket:"${S3_BUCKET_BUILD_ARTEFACTS}", path:"${JOB_NAME}/latest")
}
}
}
}
post {
always {
sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_clean'
sh 'make clean'
}
success {
withCredentials([usernamePassword(credentialsId: 'GITHUB_INTEGRATION', usernameVariable: 'GH_AUTH_USERNAME', passwordVariable: 'GH_AUTH_PASSWORD')]) {
sh "curl $GIT_API_URL \
--data '{ \
\"state\" : \"success\", \
\"target_url\": \"$TARGET_URL\", \
\"description\": \"Your build succeeded!\", \
\"context\": \"ci/jenkins\" }' \
-u $GH_AUTH_USERNAME:$GH_AUTH_PASSWORD"
}
}
failure {
mail(from: "${EMAIL_ALERT_FROM}",
to: "${EMAIL_ALERT_TO}",
subject: "Jenkins build failed: ${JOB_NAME}:${BUILD_NUMBER}",
body: "Build: ${BUILD_URL}")
withCredentials([usernamePassword(credentialsId: 'GITHUB_INTEGRATION', usernameVariable: 'GH_AUTH_USERNAME', passwordVariable: 'GH_AUTH_PASSWORD')]) {
sh "curl $GIT_API_URL \
--data '{ \
\"state\" : \"failure\", \
\"target_url\": \"$TARGET_URL\", \
\"description\": \"Your build failed\", \
\"context\": \"ci/jenkins\" }' \
-u $GH_AUTH_USERNAME:$GH_AUTH_PASSWORD"
}
}
}
// The options directive is for configuration that applies to the whole job.
options {
// we'd like to make sure we remove old builds, so we don't fill up our storage!
buildDiscarder(logRotator(numToKeepStr:'50'))
// And we'd really like to be sure that this build doesn't hang forever, so let's time it out after:
timeout(time: 30, unit: 'MINUTES')
}
}

51
Makefile Normal file

@@ -0,0 +1,51 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/sharelatex/sharelatex-dev-environment
# Version: 1.1.22
BUILD_NUMBER ?= local
BRANCH_NAME ?= $(shell git rev-parse --abbrev-ref HEAD)
PROJECT_NAME = clsi
DOCKER_COMPOSE_FLAGS ?= -f docker-compose.yml
DOCKER_COMPOSE := BUILD_NUMBER=$(BUILD_NUMBER) \
BRANCH_NAME=$(BRANCH_NAME) \
PROJECT_NAME=$(PROJECT_NAME) \
MOCHA_GREP=${MOCHA_GREP} \
docker-compose ${DOCKER_COMPOSE_FLAGS}
clean:
docker rmi ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
docker rmi gcr.io/overleaf-ops/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
rm -f app.js
rm -rf app/js
rm -rf test/unit/js
rm -rf test/acceptance/js
test: test_unit test_acceptance
test_unit:
@[ ! -d test/unit ] && echo "clsi has no unit tests" || $(DOCKER_COMPOSE) run --rm test_unit
test_acceptance: test_clean test_acceptance_pre_run test_acceptance_run
test_acceptance_run:
@[ ! -d test/acceptance ] && echo "clsi has no acceptance tests" || $(DOCKER_COMPOSE) run --rm test_acceptance
test_clean:
$(DOCKER_COMPOSE) down -v -t 0
test_acceptance_pre_run:
@[ ! -f test/acceptance/js/scripts/pre-run ] && echo "clsi has no pre acceptance tests task" || $(DOCKER_COMPOSE) run --rm test_acceptance test/acceptance/js/scripts/pre-run
build:
docker build --pull --tag ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) \
--tag gcr.io/overleaf-ops/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) \
.
tar:
$(DOCKER_COMPOSE) up tar
publish:
docker push $(DOCKER_REPO)/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
.PHONY: clean test test_unit test_acceptance test_clean build publish

README.md

@@ -1,16 +1,38 @@
clsi-sharelatex
overleaf/clsi
===============
A web api for compiling LaTeX documents in the cloud
[![Build Status](https://travis-ci.org/sharelatex/clsi-sharelatex.png?branch=master)](https://travis-ci.org/sharelatex/clsi-sharelatex)
The Common LaTeX Service Interface (CLSI) provides a RESTful interface to traditional LaTeX tools (or, more generally, any command line tool for composing marked-up documents into a display format such as PDF or HTML). The CLSI listens on the following ports by default:
* TCP/3009 - the RESTful interface
* TCP/3048 - reports load information
* TCP/3049 - HTTP interface to control the CLSI service
These defaults can be modified in `config/settings.defaults.coffee`.
The provided `Dockerfile` builds a docker image which has the docker command line tools installed. The configuration in `docker-compose-config.yml` mounts the docker socket so that the CLSI container can talk to the docker host it is running in. This allows it to spin up `sibling containers`, running an image with a TeX distribution installed, to perform the actual compiles.
The CLSI can be configured through the following environment variables:
* `DOCKER_RUNNER` - Set to true to use sibling containers
* `SYNCTEX_BIN_HOST_PATH` - Path to SyncTeX binary
* `COMPILES_HOST_DIR` - Working directory for LaTeX compiles
* `SQLITE_PATH` - Path to SQLite database
* `TEXLIVE_IMAGE` - The TEXLIVE docker image to use for sibling containers, e.g. `gcr.io/overleaf-ops/texlive-full:2017.1`
* `TEXLIVE_IMAGE_USER` - When using sibling containers, the user to run as in the TEXLIVE image. Defaults to `tex`
* `TEX_LIVE_IMAGE_NAME_OVERRIDE` - The name of the registry for the docker image e.g. `gcr.io/overleaf-ops`
* `FILESTORE_DOMAIN_OVERRIDE` - The URL for the filestore service, e.g. `http://$FILESTORE_HOST:3009`
* `STATSD_HOST` - The address of the Statsd service (used by the metrics module)
* `LISTEN_ADDRESS` - The address for the RESTful service to listen on. Set to `0.0.0.0` to listen on all network interfaces
* `SMOKE_TEST` - Whether to run smoke tests
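Once the service is running, a compile can be requested by POSTing JSON to the compile endpoint. The following is a minimal sketch in CoffeeScript; the payload shape is an assumption based on this service's request-parsing code, the npm `request` package is used purely for illustration, and `<project-id>` is a placeholder:
# Sketch of a compile request. Field names (compile.options.compiler,
# rootResourcePath, resources) are assumptions from the request parser.
request = require "request"

payload =
  compile:
    options:
      compiler: "pdflatex"      # pdflatex | latex | xelatex | lualatex
      timeout: 40               # seconds
    rootResourcePath: "main.tex"
    resources: [
      path: "main.tex"
      content: "\\documentclass{article}\n\\begin{document}\nHello world\n\\end{document}"
    ]

request.post
  url: "http://localhost:3013/project/<project-id>/compile"   # default port from app.coffee
  json: payload
, (error, res, body) ->
  throw error if error?
  console.log body?.compile?.status       # e.g. "success" or "failure"
  console.log body?.compile?.outputFiles  # urls for output.pdf, logs, etc.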
Installation
------------
The CLSI can be installed and set up as part of the entire [ShareLaTeX stack](https://github.com/sharelatex/sharelatex) (complete with front end editor and document storage), or it can be run as a standalone service. To run it as a standalone service, first check out this repository:
The CLSI can be installed and set up as part of the entire [Overleaf stack](https://github.com/overleaf/overleaf) (complete with front end editor and document storage), or it can be run as a standalone service. To run it as a standalone service, first check out this repository:
$ git clone git@github.com:sharelatex/clsi-sharelatex.git
$ git clone git@github.com:overleaf/clsi.git
Then install the required npm modules:
@@ -92,4 +114,4 @@ License
The code in this repository is released under the GNU AFFERO GENERAL PUBLIC LICENSE, version 3. A copy can be found in the `LICENSE` file.
Copyright (c) ShareLaTeX, 2014.
Copyright (c) Overleaf, 2014-2019.

app.coffee

@@ -1,14 +1,21 @@
Metrics = require "metrics-sharelatex"
Metrics.initialize("clsi")
CompileController = require "./app/js/CompileController"
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
logger.initialize("clsi")
if Settings.sentry?.dsn?
logger.initializeErrorReporting(Settings.sentry.dsn)
smokeTest = require "smoke-test-sharelatex"
ContentTypeMapper = require "./app/js/ContentTypeMapper"
Errors = require './app/js/Errors'
Path = require "path"
fs = require "fs"
Metrics = require "metrics-sharelatex"
Metrics.initialize("clsi")
Metrics.open_sockets.monitor(logger)
Metrics.memory.monitor(logger)
@@ -21,23 +28,55 @@ express = require "express"
bodyParser = require "body-parser"
app = express()
Metrics.injectMetricsRoute(app)
app.use Metrics.http.monitor(logger)
# Compile requests can take longer than the default two
# minutes (including file download time), so bump up the
# timeout a bit.
TIMEOUT = 6 * 60 * 1000
TIMEOUT = 10 * 60 * 1000
app.use (req, res, next) ->
req.setTimeout TIMEOUT
res.setTimeout TIMEOUT
res.removeHeader("X-Powered-By")
next()
app.post "/project/:project_id/compile", bodyParser.json(limit: "5mb"), CompileController.compile
app.param 'project_id', (req, res, next, project_id) ->
if project_id?.match /^[a-zA-Z0-9_-]+$/
next()
else
next new Error("invalid project id")
app.param 'user_id', (req, res, next, user_id) ->
if user_id?.match /^[0-9a-f]{24}$/
next()
else
next new Error("invalid user id")
app.param 'build_id', (req, res, next, build_id) ->
if build_id?.match OutputCacheManager.BUILD_REGEX
next()
else
next new Error("invalid build id #{build_id}")
app.post "/project/:project_id/compile", bodyParser.json(limit: Settings.compileSizeLimit), CompileController.compile
app.post "/project/:project_id/compile/stop", CompileController.stopCompile
app.delete "/project/:project_id", CompileController.clearCache
app.get "/project/:project_id/sync/code", CompileController.syncFromCode
app.get "/project/:project_id/sync/pdf", CompileController.syncFromPdf
app.get "/project/:project_id/wordcount", CompileController.wordcount
app.get "/project/:project_id/status", CompileController.status
# Per-user containers
app.post "/project/:project_id/user/:user_id/compile", bodyParser.json(limit: Settings.compileSizeLimit), CompileController.compile
app.post "/project/:project_id/user/:user_id/compile/stop", CompileController.stopCompile
app.delete "/project/:project_id/user/:user_id", CompileController.clearCache
app.get "/project/:project_id/user/:user_id/sync/code", CompileController.syncFromCode
app.get "/project/:project_id/user/:user_id/sync/pdf", CompileController.syncFromPdf
app.get "/project/:project_id/user/:user_id/wordcount", CompileController.wordcount
ForbidSymlinks = require "./app/js/StaticServerForbidSymlinks"
@@ -46,17 +85,28 @@ ForbidSymlinks = require "./app/js/StaticServerForbidSymlinks"
# and serving the files
staticServer = ForbidSymlinks express.static, Settings.path.compilesDir, setHeaders: (res, path, stat) ->
if Path.basename(path) == "output.pdf"
res.set("Content-Type", "application/pdf")
# Calculate an etag in the same way as nginx
# https://github.com/tj/send/issues/65
etag = (path, stat) ->
'"' + Math.ceil(+stat.mtime / 1000).toString(16) +
'-' + Number(stat.size).toString(16) + '"'
res.set("Etag", etag(path, stat))
else
# Force plain treatment of other file types to prevent hosting of HTTP/JS files
# that could be used in same-origin/XSS attacks.
res.set("Content-Type", "text/plain")
res.set("Content-Type", ContentTypeMapper.map(path))
app.get "/project/:project_id/user/:user_id/build/:build_id/output/*", (req, res, next) ->
# for specific build get the path from the OutputCacheManager (e.g. .clsi/buildId)
req.url = "/#{req.params.project_id}-#{req.params.user_id}/" + OutputCacheManager.path(req.params.build_id, "/#{req.params[0]}")
staticServer(req, res, next)
app.get "/project/:project_id/build/:build_id/output/*", (req, res, next) ->
# for specific build get the path from the OutputCacheManager (e.g. .clsi/buildId)
req.url = "/#{req.params.project_id}/" + OutputCacheManager.path(req.params.build_id, "/#{req.params[0]}")
staticServer(req, res, next)
app.get "/project/:project_id/user/:user_id/output/*", (req, res, next) ->
# for specific user get the path to the top level file
req.url = "/#{req.params.project_id}-#{req.params.user_id}/#{req.params[0]}"
staticServer(req, res, next)
app.get "/project/:project_id/output/*", (req, res, next) ->
if req.query?.build? && req.query.build.match(OutputCacheManager.BUILD_REGEX)
@@ -66,6 +116,11 @@ app.get "/project/:project_id/output/*", (req, res, next) ->
req.url = "/#{req.params.project_id}/#{req.params[0]}"
staticServer(req, res, next)
app.get "/oops", (req, res, next) ->
logger.error {err: "hello"}, "test error"
res.send "error\n"
app.get "/status", (req, res, next) ->
res.send "CLSI is alive\n"
@@ -82,13 +137,16 @@ if Settings.smokeTest
do runSmokeTest = ->
logger.log("running smoke tests")
smokeTest.run(require.resolve(__dirname + "/test/smoke/js/SmokeTests.js"))({}, resCacher)
setTimeout(runSmokeTest, 20 * 1000)
setTimeout(runSmokeTest, 30 * 1000)
app.get "/health_check", (req, res)->
res.contentType(resCacher?.setContentType)
res.status(resCacher?.code).send(resCacher?.body)
profiler = require "v8-profiler"
app.get "/smoke_test_force", (req, res)->
smokeTest.run(require.resolve(__dirname + "/test/smoke/js/SmokeTests.js"))(req, res)
profiler = require "v8-profiler-node8"
app.get "/profile", (req, res) ->
time = parseInt(req.query.time || "1000")
profiler.startProfiling("test")
@@ -102,12 +160,85 @@ app.get "/heapdump", (req, res)->
res.send filename
app.use (error, req, res, next) ->
logger.error err: error, "server error"
res.sendStatus(error?.statusCode || 500)
if error instanceof Errors.NotFoundError
logger.warn {err: error, url: req.url}, "not found error"
return res.sendStatus(404)
else
logger.error {err: error, url: req.url}, "server error"
res.sendStatus(error?.statusCode || 500)
app.listen port = (Settings.internal?.clsi?.port or 3013), host = (Settings.internal?.clsi?.host or "localhost"), (error) ->
logger.info "CLSI starting up, listening on #{host}:#{port}"
net = require "net"
os = require "os"
STATE = "up"
loadTcpServer = net.createServer (socket) ->
socket.on "error", (err)->
if err.code == "ECONNRESET"
# this always comes up, we don't know why
return
logger.err err:err, "error with socket on load check"
socket.destroy()
if STATE == "up" and Settings.internal.load_balancer_agent.report_load
currentLoad = os.loadavg()[0]
# staging clsi's have 1 cpu core only
if os.cpus().length == 1
availableWorkingCpus = 1
else
availableWorkingCpus = os.cpus().length - 1
freeLoad = availableWorkingCpus - currentLoad
freeLoadPercentage = Math.round((freeLoad / availableWorkingCpus) * 100)
if freeLoadPercentage <= 0
freeLoadPercentage = 1 # when it's 0 the server is set to drain and will move projects to different servers
socket.write("up, #{freeLoadPercentage}%\n", "ASCII")
socket.end()
else
socket.write("#{STATE}\n", "ASCII")
socket.end()
loadHttpServer = express()
loadHttpServer.post "/state/up", (req, res, next) ->
STATE = "up"
logger.info "getting message to set server to down"
res.sendStatus 204
loadHttpServer.post "/state/down", (req, res, next) ->
STATE = "down"
logger.info "getting message to set server to down"
res.sendStatus 204
loadHttpServer.post "/state/maint", (req, res, next) ->
STATE = "maint"
logger.info "getting message to set server to maint"
res.sendStatus 204
port = (Settings.internal?.clsi?.port or 3013)
host = (Settings.internal?.clsi?.host or "localhost")
load_tcp_port = Settings.internal.load_balancer_agent.load_port
load_http_port = Settings.internal.load_balancer_agent.local_port
if !module.parent # Called directly
app.listen port, host, (error) ->
logger.info "CLSI starting up, listening on #{host}:#{port}"
loadTcpServer.listen load_tcp_port, host, (error) ->
throw error if error?
logger.info "Load tcp agent listening on load port #{load_tcp_port}"
loadHttpServer.listen load_http_port, host, (error) ->
throw error if error?
logger.info "Load http agent listening on load port #{load_http_port}"
module.exports = app
setInterval () ->
ProjectPersistenceManager.clearExpiredProjects()
, tenMinutes = 10 * 60 * 1000
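As a quick illustration of the load agent's one-line TCP protocol above, a client can connect to the load port and read back the state and free capacity. A sketch; the port value is an assumption taken from the README's default of TCP/3048:
# Sketch: probe the load-balancer agent over TCP.
net = require "net"
client = net.connect 3048, "localhost", ->
  client.on "data", (data) ->
    console.log data.toString()   # e.g. "up, 73%\n" or "maint\n"
    client.end()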

app/coffee/CommandRunner.coffee

@@ -1,12 +1,11 @@
spawn = require("child_process").spawn
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
module.exports = CommandRunner =
run: (project_id, command, directory, timeout, callback = (error) ->) ->
command = (arg.replace('$COMPILE_DIR', directory) for arg in command)
logger.log project_id: project_id, command: command, directory: directory, "running command"
logger.warn "timeouts and sandboxing are not enabled with CommandRunner"
if Settings.clsi?.dockerRunner == true
commandRunnerPath = "./DockerRunner"
else
commandRunnerPath = "./LocalCommandRunner"
logger.info commandRunnerPath:commandRunnerPath, "selecting command runner for clsi"
CommandRunner = require(commandRunnerPath)
proc = spawn command[0], command.slice(1), stdio: "inherit", cwd: directory
proc.on "close", () ->
callback()
module.exports = CommandRunner

app/coffee/CompileController.coffee

@@ -4,6 +4,7 @@ Settings = require "settings-sharelatex"
Metrics = require "./Metrics"
ProjectPersistenceManager = require "./ProjectPersistenceManager"
logger = require "logger-sharelatex"
Errors = require "./Errors"
module.exports = CompileController =
compile: (req, res, next = (error) ->) ->
@@ -11,35 +12,67 @@ module.exports = CompileController =
RequestParser.parse req.body, (error, request) ->
return next(error) if error?
request.project_id = req.params.project_id
request.user_id = req.params.user_id if req.params.user_id?
ProjectPersistenceManager.markProjectAsJustAccessed request.project_id, (error) ->
return next(error) if error?
CompileManager.doCompile request, (error, outputFiles = []) ->
if error?
logger.error err: error, project_id: request.project_id, "error running compile"
CompileManager.doCompileWithLock request, (error, outputFiles = []) ->
if error instanceof Errors.AlreadyCompilingError
code = 423 # Http 423 Locked
status = "compile-in-progress"
else if error instanceof Errors.FilesOutOfSyncError
code = 409 # Http 409 Conflict
status = "retry"
else if error?.terminated
status = "terminated"
else if error?.validate
status = "validation-#{error.validate}"
else if error?
if error.timedout
status = "timedout"
logger.log err: error, project_id: request.project_id, "timeout running compile"
else
status = "error"
code = 500
logger.warn err: error, project_id: request.project_id, "error running compile"
else
status = "failure"
for file in outputFiles
if file.path?.match(/output\.pdf$/)
status = "success"
if status == "failure"
logger.warn project_id: request.project_id, outputFiles:outputFiles, "project failed to compile successfully, no output.pdf generated"
# log an error if any core files are found
for file in outputFiles
if file.path is "core"
logger.error project_id:request.project_id, req:req, outputFiles:outputFiles, "core file found in output"
timer.done()
res.status(code or 200).send {
compile:
status: status
error: error?.message or error
outputFiles: outputFiles.map (file) ->
url: "#{Settings.apis.clsi.url}/project/#{request.project_id}/output/#{file.path}"
url:
"#{Settings.apis.clsi.url}/project/#{request.project_id}" +
(if request.user_id? then "/user/#{request.user_id}" else "") +
(if file.build? then "/build/#{file.build}" else "") +
"/output/#{file.path}"
path: file.path
type: file.type
build: file.build
}
stopCompile: (req, res, next) ->
{project_id, user_id} = req.params
CompileManager.stopCompile project_id, user_id, (error) ->
return next(error) if error?
res.sendStatus(204)
clearCache: (req, res, next = (error) ->) ->
ProjectPersistenceManager.clearProject req.params.project_id, (error) ->
ProjectPersistenceManager.clearProject req.params.project_id, req.params.user_id, (error) ->
return next(error) if error?
res.sendStatus(204) # No content
@@ -48,10 +81,10 @@ module.exports = CompileController =
line = parseInt(req.query.line, 10)
column = parseInt(req.query.column, 10)
project_id = req.params.project_id
CompileManager.syncFromCode project_id, file, line, column, (error, pdfPositions) ->
user_id = req.params.user_id
CompileManager.syncFromCode project_id, user_id, file, line, column, (error, pdfPositions) ->
return next(error) if error?
res.send JSON.stringify {
res.json {
pdf: pdfPositions
}
@@ -60,19 +93,26 @@ module.exports = CompileController =
h = parseFloat(req.query.h)
v = parseFloat(req.query.v)
project_id = req.params.project_id
CompileManager.syncFromPdf project_id, page, h, v, (error, codePositions) ->
user_id = req.params.user_id
CompileManager.syncFromPdf project_id, user_id, page, h, v, (error, codePositions) ->
return next(error) if error?
res.send JSON.stringify {
res.json {
code: codePositions
}
wordcount: (req, res, next = (error) ->) ->
file = req.query.file || "main.tex"
project_id = req.params.project_id
user_id = req.params.user_id
image = req.query.image
logger.log {image, file, project_id}, "word count request"
CompileManager.wordcount project_id, file, (error, result) ->
CompileManager.wordcount project_id, user_id, file, image, (error, result) ->
return next(error) if error?
res.send JSON.stringify {
res.json {
texcount: result
}
status: (req, res, next = (error)-> )->
res.send("OK")

app/coffee/CompileManager.coffee

@@ -7,82 +7,254 @@ Path = require "path"
logger = require "logger-sharelatex"
Metrics = require "./Metrics"
child_process = require "child_process"
CommandRunner = require(Settings.clsi?.commandRunner or "./CommandRunner")
DraftModeManager = require "./DraftModeManager"
TikzManager = require "./TikzManager"
LockManager = require "./LockManager"
fs = require("fs")
fse = require "fs-extra"
os = require("os")
async = require "async"
Errors = require './Errors'
CommandRunner = require "./CommandRunner"
getCompileName = (project_id, user_id) ->
if user_id? then "#{project_id}-#{user_id}" else project_id
getCompileDir = (project_id, user_id) ->
Path.join(Settings.path.compilesDir, getCompileName(project_id, user_id))
module.exports = CompileManager =
doCompile: (request, callback = (error, outputFiles) ->) ->
compileDir = Path.join(Settings.path.compilesDir, request.project_id)
timer = new Metrics.Timer("write-to-disk")
logger.log project_id: request.project_id, "starting compile"
ResourceWriter.syncResourcesToDisk request.project_id, request.resources, compileDir, (error) ->
doCompileWithLock: (request, callback = (error, outputFiles) ->) ->
compileDir = getCompileDir(request.project_id, request.user_id)
lockFile = Path.join(compileDir, ".project-lock")
# use a .project-lock file in the compile directory to prevent
# simultaneous compiles
fse.ensureDir compileDir, (error) ->
return callback(error) if error?
logger.log project_id: request.project_id, time_taken: Date.now() - timer.start, "written files to disk"
LockManager.runWithLock lockFile, (releaseLock) ->
CompileManager.doCompile(request, releaseLock)
, callback
doCompile: (request, callback = (error, outputFiles) ->) ->
compileDir = getCompileDir(request.project_id, request.user_id)
timer = new Metrics.Timer("write-to-disk")
logger.log project_id: request.project_id, user_id: request.user_id, "syncing resources to disk"
ResourceWriter.syncResourcesToDisk request, compileDir, (error, resourceList) ->
# NOTE: resourceList is insecure, it should only be used to exclude files from the output list
if error? and error instanceof Errors.FilesOutOfSyncError
logger.warn project_id: request.project_id, user_id: request.user_id, "files out of sync, please retry"
return callback(error)
else if error?
logger.err err:error, project_id: request.project_id, user_id: request.user_id, "error writing resources to disk"
return callback(error)
logger.log project_id: request.project_id, user_id: request.user_id, time_taken: Date.now() - timer.start, "written files to disk"
timer.done()
timer = new Metrics.Timer("run-compile")
Metrics.inc("compiles")
LatexRunner.runLatex request.project_id, {
directory: compileDir
mainFile: request.rootResourcePath
compiler: request.compiler
timeout: request.timeout
}, (error) ->
return callback(error) if error?
logger.log project_id: request.project_id, time_taken: Date.now() - timer.start, "done compile"
timer.done()
injectDraftModeIfRequired = (callback) ->
if request.draft
DraftModeManager.injectDraftMode Path.join(compileDir, request.rootResourcePath), callback
else
callback()
OutputFileFinder.findOutputFiles request.resources, compileDir, (error, outputFiles) ->
createTikzFileIfRequired = (callback) ->
TikzManager.checkMainFile compileDir, request.rootResourcePath, resourceList, (error, needsMainFile) ->
return callback(error) if error?
OutputCacheManager.saveOutputFiles outputFiles, compileDir, (error, newOutputFiles) ->
callback null, newOutputFiles
if needsMainFile
TikzManager.injectOutputFile compileDir, request.rootResourcePath, callback
else
callback()
clearProject: (project_id, _callback = (error) ->) ->
# set up environment variables for chktex
env = {}
# only run chktex on LaTeX files (not knitr .Rtex files or any others)
isLaTeXFile = request.rootResourcePath?.match(/\.tex$/i)
if request.check? and isLaTeXFile
env['CHKTEX_OPTIONS'] = '-nall -e9 -e10 -w15 -w16'
env['CHKTEX_ULIMIT_OPTIONS'] = '-t 5 -v 64000'
if request.check is 'error'
env['CHKTEX_EXIT_ON_ERROR'] = 1
if request.check is 'validate'
env['CHKTEX_VALIDATE'] = 1
# apply a series of file modifications/creations for draft mode and tikz
async.series [injectDraftModeIfRequired, createTikzFileIfRequired], (error) ->
return callback(error) if error?
timer = new Metrics.Timer("run-compile")
# find the image tag to log it as a metric, e.g. 2015.1 (convert . to - for graphite)
tag = request.imageName?.match(/:(.*)/)?[1]?.replace(/\./g,'-') or "default"
tag = "other" if not request.project_id.match(/^[0-9a-f]{24}$/) # exclude smoke test
Metrics.inc("compiles")
Metrics.inc("compiles-with-image.#{tag}")
compileName = getCompileName(request.project_id, request.user_id)
LatexRunner.runLatex compileName, {
directory: compileDir
mainFile: request.rootResourcePath
compiler: request.compiler
timeout: request.timeout
image: request.imageName
flags: request.flags
environment: env
}, (error, output, stats, timings) ->
# request was for validation only
if request.check is "validate"
result = if error?.code then "fail" else "pass"
error = new Error("validation")
error.validate = result
# request was for compile, and failed on validation
if request.check is "error" and error?.message is 'exited'
error = new Error("compilation")
error.validate = "fail"
# compile was killed by user, was a validation, or a compile which failed validation
if error?.terminated or error?.validate
OutputFileFinder.findOutputFiles resourceList, compileDir, (err, outputFiles) ->
return callback(err) if err?
callback(error, outputFiles) # return output files so user can check logs
return
# compile completed normally
return callback(error) if error?
Metrics.inc("compiles-succeeded")
for metric_key, metric_value of stats or {}
Metrics.count(metric_key, metric_value)
for metric_key, metric_value of timings or {}
Metrics.timing(metric_key, metric_value)
loadavg = os.loadavg?()
Metrics.gauge("load-avg", loadavg[0]) if loadavg?
ts = timer.done()
logger.log {project_id: request.project_id, user_id: request.user_id, time_taken: ts, stats:stats, timings:timings, loadavg:loadavg}, "done compile"
if stats?["latex-runs"] > 0
Metrics.timing("run-compile-per-pass", ts / stats["latex-runs"])
if stats?["latex-runs"] > 0 and timings?["cpu-time"] > 0
Metrics.timing("run-compile-cpu-time-per-pass", timings["cpu-time"] / stats["latex-runs"])
OutputFileFinder.findOutputFiles resourceList, compileDir, (error, outputFiles) ->
return callback(error) if error?
OutputCacheManager.saveOutputFiles outputFiles, compileDir, (error, newOutputFiles) ->
callback null, newOutputFiles
stopCompile: (project_id, user_id, callback = (error) ->) ->
compileName = getCompileName(project_id, user_id)
LatexRunner.killLatex compileName, callback
clearProject: (project_id, user_id, _callback = (error) ->) ->
callback = (error) ->
_callback(error)
_callback = () ->
compileDir = Path.join(Settings.path.compilesDir, project_id)
proc = child_process.spawn "rm", ["-r", compileDir]
compileDir = getCompileDir(project_id, user_id)
proc.on "error", callback
CompileManager._checkDirectory compileDir, (err, exists) ->
return callback(err) if err?
return callback() if not exists # skip removal if no directory present
stderr = ""
proc.stderr.on "data", (chunk) -> stderr += chunk.toString()
proc = child_process.spawn "rm", ["-r", compileDir]
proc.on "close", (code) ->
if code == 0
return callback(null)
proc.on "error", callback
stderr = ""
proc.stderr.on "data", (chunk) -> stderr += chunk.toString()
proc.on "close", (code) ->
if code == 0
return callback(null)
else
return callback(new Error("rm -r #{compileDir} failed: #{stderr}"))
_findAllDirs: (callback = (error, allDirs) ->) ->
root = Settings.path.compilesDir
fs.readdir root, (err, files) ->
return callback(err) if err?
allDirs = (Path.join(root, file) for file in files)
callback(null, allDirs)
clearExpiredProjects: (max_cache_age_ms, callback = (error) ->) ->
now = Date.now()
# action for each directory
expireIfNeeded = (checkDir, cb) ->
fs.stat checkDir, (err, stats) ->
return cb() if err? # ignore errors checking directory
age = now - stats.mtime
hasExpired = (age > max_cache_age_ms)
if hasExpired then fse.remove(checkDir, cb) else cb()
# iterate over all project directories
CompileManager._findAllDirs (error, allDirs) ->
return callback() if error?
async.eachSeries allDirs, expireIfNeeded, callback
_checkDirectory: (compileDir, callback = (error, exists) ->) ->
fs.lstat compileDir, (err, stats) ->
if err?.code is 'ENOENT'
return callback(null, false) # directory does not exist
else if err?
logger.err {dir: compileDir, err:err}, "error on stat of project directory for removal"
return callback(err)
else if not stats?.isDirectory()
logger.err {dir: compileDir, stats:stats}, "bad project directory for removal"
return callback new Error("project directory is not directory")
else
return callback(new Error("rm -r #{compileDir} failed: #{stderr}"))
callback(null, true) # directory exists
syncFromCode: (project_id, file_name, line, column, callback = (error, pdfPositions) ->) ->
syncFromCode: (project_id, user_id, file_name, line, column, callback = (error, pdfPositions) ->) ->
# If LaTeX was run in a virtual environment, the file path that synctex expects
# might not match the file path on the host. The .synctex.gz file however, will be accessed
# wherever it is on the host.
base_dir = Settings.path.synctexBaseDir(project_id)
compileName = getCompileName(project_id, user_id)
base_dir = Settings.path.synctexBaseDir(compileName)
file_path = base_dir + "/" + file_name
synctex_path = Path.join(Settings.path.compilesDir, project_id, "output.pdf")
CompileManager._runSynctex ["code", synctex_path, file_path, line, column], (error, stdout) ->
return callback(error) if error?
logger.log project_id: project_id, file_name: file_name, line: line, column: column, stdout: stdout, "synctex code output"
callback null, CompileManager._parseSynctexFromCodeOutput(stdout)
compileDir = getCompileDir(project_id, user_id)
synctex_path = "#{base_dir}/output.pdf"
command = ["code", synctex_path, file_path, line, column]
fse.ensureDir compileDir, (error) ->
if error?
logger.err {error, project_id, user_id, file_name}, "error ensuring dir for sync from code"
return callback(error)
CompileManager._runSynctex project_id, user_id, command, (error, stdout) ->
return callback(error) if error?
logger.log project_id: project_id, user_id:user_id, file_name: file_name, line: line, column: column, command:command, stdout: stdout, "synctex code output"
callback null, CompileManager._parseSynctexFromCodeOutput(stdout)
syncFromPdf: (project_id, page, h, v, callback = (error, filePositions) ->) ->
base_dir = Settings.path.synctexBaseDir(project_id)
synctex_path = Path.join(Settings.path.compilesDir, project_id, "output.pdf")
CompileManager._runSynctex ["pdf", synctex_path, page, h, v], (error, stdout) ->
return callback(error) if error?
logger.log project_id: project_id, page: page, h: h, v:v, stdout: stdout, "synctex pdf output"
callback null, CompileManager._parseSynctexFromPdfOutput(stdout, base_dir)
syncFromPdf: (project_id, user_id, page, h, v, callback = (error, filePositions) ->) ->
compileName = getCompileName(project_id, user_id)
compileDir = getCompileDir(project_id, user_id)
base_dir = Settings.path.synctexBaseDir(compileName)
synctex_path = "#{base_dir}/output.pdf"
command = ["pdf", synctex_path, page, h, v]
fse.ensureDir compileDir, (error) ->
if error?
logger.err {error, project_id, user_id}, "error ensuring dir for sync to code"
return callback(error)
CompileManager._runSynctex project_id, user_id, command, (error, stdout) ->
return callback(error) if error?
logger.log project_id: project_id, user_id:user_id, page: page, h: h, v:v, stdout: stdout, "synctex pdf output"
callback null, CompileManager._parseSynctexFromPdfOutput(stdout, base_dir)
_runSynctex: (args, callback = (error, stdout) ->) ->
bin_path = Path.resolve(__dirname + "/../../bin/synctex")
_checkFileExists: (path, callback = (error) ->) ->
synctexDir = Path.dirname(path)
synctexFile = Path.join(synctexDir, "output.synctex.gz")
fs.stat synctexDir, (error, stats) ->
if error?.code is 'ENOENT'
return callback(new Errors.NotFoundError("called synctex with no output directory"))
return callback(error) if error?
fs.stat synctexFile, (error, stats) ->
if error?.code is 'ENOENT'
return callback(new Errors.NotFoundError("called synctex with no output file"))
return callback(error) if error?
return callback(new Error("not a file")) if not stats?.isFile()
callback()
_runSynctex: (project_id, user_id, command, callback = (error, stdout) ->) ->
seconds = 1000
child_process.execFile bin_path, args, timeout: 10 * seconds, (error, stdout, stderr) ->
return callback(error) if error?
callback(null, stdout)
command.unshift("/opt/synctex")
directory = getCompileDir(project_id, user_id)
timeout = 60 * 1000 # increased to allow for large projects
compileName = getCompileName(project_id, user_id)
CommandRunner.run compileName, command, directory, Settings.clsi?.docker.image, timeout, {}, (error, output) ->
if error?
logger.err err:error, command:command, project_id:project_id, user_id:user_id, "error running synctex"
return callback(error)
callback(null, output.stdout)
_parseSynctexFromCodeOutput: (output) ->
results = []
@@ -111,17 +283,28 @@ module.exports = CompileManager =
}
return results
wordcount: (project_id, file_name, callback = (error, pdfPositions) ->) ->
logger.log project_id:project_id, file_name:file_name, "running wordcount"
file_path = "$COMPILE_DIR/" + file_name
command = [ "texcount", '-inc', file_path, "-out=" + file_path + ".wc"]
directory = Path.join(Settings.path.compilesDir, project_id)
timeout = 10 * 1000
CommandRunner.run project_id, command, directory, timeout, (error) ->
return callback(error) if error?
stdout = fs.readFileSync(directory + "/" + file_name + ".wc", "utf-8")
callback null, CompileManager._parseWordcountFromOutput(stdout)
wordcount: (project_id, user_id, file_name, image, callback = (error, pdfPositions) ->) ->
logger.log project_id:project_id, user_id:user_id, file_name:file_name, image:image, "running wordcount"
file_path = "$COMPILE_DIR/" + file_name
command = [ "texcount", '-nocol', '-inc', file_path, "-out=" + file_path + ".wc"]
compileDir = getCompileDir(project_id, user_id)
timeout = 60 * 1000
compileName = getCompileName(project_id, user_id)
fse.ensureDir compileDir, (error) ->
if error?
logger.err {error, project_id, user_id, file_name}, "error ensuring dir for wordcount"
return callback(error)
CommandRunner.run compileName, command, compileDir, image, timeout, {}, (error) ->
return callback(error) if error?
fs.readFile compileDir + "/" + file_name + ".wc", "utf-8", (err, stdout) ->
if err?
# call it node_err so sentry doesn't use the random path in the error as a unique id, which would make the error impossible to ignore
logger.err node_err:err, command:command, compileDir:compileDir, project_id:project_id, user_id:user_id, "error reading word count output"
return callback(err)
results = CompileManager._parseWordcountFromOutput(stdout)
logger.log project_id:project_id, user_id:user_id, wordcount: results, "word count results"
callback null, results
_parseWordcountFromOutput: (output) ->
results = {
@@ -133,6 +316,8 @@ module.exports = CompileManager =
elements: 0
mathInline: 0
mathDisplay: 0
errors: 0
messages: ""
}
for line in output.split("\n")
[data, info] = line.split(":")
@@ -152,4 +337,8 @@ module.exports = CompileManager =
results['mathInline'] = parseInt(info, 10)
if data.indexOf("Number of math displayed") > -1
results['mathDisplay'] = parseInt(info, 10)
if data is "(errors" # errors reported as (errors:123)
results['errors'] = parseInt(info, 10)
if line.indexOf("!!! ") > -1 # errors logged as !!! message !!!
results['messages'] += line + "\n"
return results
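The synctex plumbing above is exposed through the sync routes registered in app.coffee. A hedged sketch of calling them; the query parameter names follow CompileController.syncFromCode/syncFromPdf, while the `request` client and the response shapes shown in comments are illustrative assumptions:
# Sketch of forward sync (code -> pdf position) and reverse sync
# (pdf -> code position).
request = require "request"
base = "http://localhost:3013/project/<project-id>"

request.get "#{base}/sync/code?file=main.tex&line=10&column=3", (error, res, body) ->
  console.log body   # assumed shape: {"pdf": [...pdf positions...]}

request.get "#{base}/sync/pdf?page=1&h=100.50&v=200.50", (error, res, body) ->
  console.log body   # assumed shape: {"code": [...file/line positions...]}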

app/coffee/ContentTypeMapper.coffee Normal file

@@ -0,0 +1,24 @@
Path = require 'path'
# here we coerce html, css and js to text/plain,
# otherwise choose correct mime type based on file extension,
# falling back to octet-stream
module.exports = ContentTypeMapper =
map: (path) ->
switch Path.extname(path)
when '.txt', '.html', '.js', '.css', '.svg'
return 'text/plain'
when '.csv'
return 'text/csv'
when '.pdf'
return 'application/pdf'
when '.png'
return 'image/png'
when '.jpg', '.jpeg'
return 'image/jpeg'
when '.tiff'
return 'image/tiff'
when '.gif'
return 'image/gif'
else
return 'application/octet-stream'
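A short usage sketch of the mapping above; return values follow the switch cases:
# Sketch: the mapper is keyed purely on file extension; html/js/css/svg
# are deliberately coerced to text/plain (see the XSS note in app.coffee).
ContentTypeMapper = require "./app/js/ContentTypeMapper"
console.log ContentTypeMapper.map("output.pdf")    # 'application/pdf'
console.log ContentTypeMapper.map("figure.svg")    # 'text/plain'
console.log ContentTypeMapper.map("data.xyz")      # 'application/octet-stream'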

13
app/coffee/DbQueue.coffee Normal file

@@ -0,0 +1,13 @@
async = require "async"
Settings = require "settings-sharelatex"
logger = require("logger-sharelatex")
queue = async.queue((task, cb)->
task(cb)
, Settings.parallelSqlQueryLimit)
queue.drain = ()->
logger.debug('all items have been processed')
module.exports =
queue: queue
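A hedged usage sketch: tasks are functions that receive a completion callback, and at most Settings.parallelSqlQueryLimit of them run concurrently. The query helper below is hypothetical; only the queue wiring comes from the module above:
# Sketch: push one unit of SQL work onto the shared queue.
DbQueue = require "./app/js/DbQueue"
DbQueue.queue.push (cb) ->
  runSomeSqlQuery (error, rows) ->   # hypothetical query helper
    cb(error)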

app/coffee/DockerLockManager.coffee Normal file

@@ -0,0 +1,56 @@
logger = require "logger-sharelatex"
LockState = {} # locks for docker container operations, by container name
module.exports = LockManager =
MAX_LOCK_HOLD_TIME: 15000 # how long we can keep a lock
MAX_LOCK_WAIT_TIME: 10000 # how long we wait for a lock
LOCK_TEST_INTERVAL: 1000 # retry time
tryLock: (key, callback = (err, gotLock) ->) ->
existingLock = LockState[key]
if existingLock? # the lock is already taken, check how old it is
lockAge = Date.now() - existingLock.created
if lockAge < LockManager.MAX_LOCK_HOLD_TIME
return callback(null, false) # we didn't get the lock, bail out
else
logger.error {key: key, lock: existingLock, age:lockAge}, "taking old lock by force"
# take the lock
LockState[key] = lockValue = {created: Date.now()}
callback(null, true, lockValue)
getLock: (key, callback = (error, lockValue) ->) ->
startTime = Date.now()
do attempt = () ->
LockManager.tryLock key, (error, gotLock, lockValue) ->
return callback(error) if error?
if gotLock
callback(null, lockValue)
else if Date.now() - startTime > LockManager.MAX_LOCK_WAIT_TIME
e = new Error("Lock timeout")
e.key = key
return callback(e)
else
setTimeout attempt, LockManager.LOCK_TEST_INTERVAL
releaseLock: (key, lockValue, callback = (error) ->) ->
existingLock = LockState[key]
if existingLock is lockValue # lockValue is an object, so we can test by reference
delete LockState[key] # our lock, so we can free it
callback()
else if existingLock? # lock exists but doesn't match ours
logger.error {key:key, lock: existingLock}, "tried to release lock taken by force"
callback()
else
logger.error {key:key, lock: existingLock}, "tried to release lock that has gone"
callback()
runWithLock: (key, runner = ( (releaseLock = (error) ->) -> ), callback = ( (error) -> )) ->
LockManager.getLock key, (error, lockValue) ->
return callback(error) if error?
runner (error1, args...) ->
LockManager.releaseLock key, lockValue, (error2) ->
error = error1 or error2
return callback(error) if error?
callback(null, args...)
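A sketch of how runWithLock is meant to be used; the container operation is hypothetical, and the lock key is a container name, matching the comment at the top of the file:
# Sketch: serialise operations on one container by locking on its name.
# The runner receives releaseLock and must call it exactly once; extra
# arguments are forwarded to the outer callback.
LockManager = require "./app/js/DockerLockManager"
LockManager.runWithLock "project-abc123-deadbeef", (releaseLock) ->
  inspectOrStartContainer (error, result) ->   # hypothetical operation
    releaseLock(error, result)
, (error, result) ->
  console.log "container operation finished", error?, result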

app/coffee/DockerRunner.coffee Normal file

@@ -0,0 +1,358 @@
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
Docker = require("dockerode")
dockerode = new Docker()
crypto = require "crypto"
async = require "async"
LockManager = require "./DockerLockManager"
fs = require "fs"
Path = require 'path'
_ = require "underscore"
logger.info "using docker runner"
usingSiblingContainers = () ->
Settings?.path?.sandboxedCompilesHostDir?
module.exports = DockerRunner =
ERR_NOT_DIRECTORY: new Error("not a directory")
ERR_TERMINATED: new Error("terminated")
ERR_EXITED: new Error("exited")
ERR_TIMED_OUT: new Error("container timed out")
run: (project_id, command, directory, image, timeout, environment, callback = (error, output) ->) ->
if usingSiblingContainers()
_newPath = Settings.path.sandboxedCompilesHostDir
logger.log {path: _newPath}, "altering bind path for sibling containers"
# Server Pro, example:
# '/var/lib/sharelatex/data/compiles/<project-id>'
# ... becomes ...
# '/opt/sharelatex_data/data/compiles/<project-id>'
directory = Path.join(Settings.path.sandboxedCompilesHostDir, Path.basename(directory))
volumes = {}
volumes[directory] = "/compile"
command = (arg.toString().replace?('$COMPILE_DIR', "/compile") for arg in command)
if !image?
image = Settings.clsi.docker.image
if Settings.texliveImageNameOveride?
img = image.split("/")
image = "#{Settings.texliveImageNameOveride}/#{img[2]}"
options = DockerRunner._getContainerOptions(command, image, volumes, timeout, environment)
fingerprint = DockerRunner._fingerprintContainer(options)
options.name = name = "project-#{project_id}-#{fingerprint}"
# logOptions = _.clone(options)
# logOptions?.HostConfig?.SecurityOpt = "secomp used, removed in logging"
logger.log project_id: project_id, "running docker container"
DockerRunner._runAndWaitForContainer options, volumes, timeout, (error, output) ->
if error?.message?.match("HTTP code is 500")
logger.log err: error, project_id: project_id, "error running container so destroying and retrying"
DockerRunner.destroyContainer name, null, true, (error) ->
return callback(error) if error?
DockerRunner._runAndWaitForContainer options, volumes, timeout, callback
else
callback(error, output)
return name # pass back the container name to allow it to be killed
kill: (container_id, callback = (error) ->) ->
logger.log container_id: container_id, "sending kill signal to container"
container = dockerode.getContainer(container_id)
container.kill (error) ->
if error? and error?.message?.match?(/Cannot kill container .* is not running/)
logger.warn err: error, container_id: container_id, "container not running, continuing"
error = null
if error?
logger.error err: error, container_id: container_id, "error killing container"
return callback(error)
else
callback()
_runAndWaitForContainer: (options, volumes, timeout, _callback = (error, output) ->) ->
callback = (args...) ->
_callback(args...)
# Only call the callback once
_callback = () ->
name = options.name
streamEnded = false
containerReturned = false
output = {}
callbackIfFinished = () ->
if streamEnded and containerReturned
callback(null, output)
attachStreamHandler = (error, _output) ->
return callback(error) if error?
output = _output
streamEnded = true
callbackIfFinished()
DockerRunner.startContainer options, volumes, attachStreamHandler, (error, containerId) ->
return callback(error) if error?
DockerRunner.waitForContainer name, timeout, (error, exitCode) ->
return callback(error) if error?
if exitCode is 137 # exit status from kill -9
err = DockerRunner.ERR_TERMINATED
err.terminated = true
return callback(err)
if exitCode is 1 # exit status from chktex
err = DockerRunner.ERR_EXITED
err.code = exitCode
return callback(err)
containerReturned = true
options?.HostConfig?.SecurityOpt = null #small log line
logger.log err:err, exitCode:exitCode, options:options, "docker container has exited"
callbackIfFinished()
_getContainerOptions: (command, image, volumes, timeout, environment) ->
timeoutInSeconds = timeout / 1000
dockerVolumes = {}
for hostVol, dockerVol of volumes
dockerVolumes[dockerVol] = {}
if volumes[hostVol].slice(-3).indexOf(":r") == -1
volumes[hostVol] = "#{dockerVol}:rw"
# merge settings and environment parameter
env = {}
for src in [Settings.clsi.docker.env, environment or {}]
env[key] = value for key, value of src
# set the path based on the image year
if m = image.match /:([0-9]+)\.[0-9]+/
year = m[1]
else
year = "2014"
env['PATH'] = "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/#{year}/bin/x86_64-linux/"
options =
"Cmd" : command,
"Image" : image
"Volumes" : dockerVolumes
"WorkingDir" : "/compile"
"NetworkDisabled" : true
"Memory" : 1024 * 1024 * 1024 * 1024 # 1 Gb
"User" : Settings.clsi.docker.user
"Env" : ("#{key}=#{value}" for key, value of env) # convert the environment hash to an array
"HostConfig" :
"Binds": ("#{hostVol}:#{dockerVol}" for hostVol, dockerVol of volumes)
"LogConfig": {"Type": "none", "Config": {}}
"Ulimits": [{'Name': 'cpu', 'Soft': timeoutInSeconds+5, 'Hard': timeoutInSeconds+10}]
"CapDrop": "ALL"
"SecurityOpt": ["no-new-privileges"]
if Settings.path?.synctexBinHostPath?
options["HostConfig"]["Binds"].push("#{Settings.path.synctexBinHostPath}:/opt/synctex:ro")
if Settings.clsi.docker.seccomp_profile?
options.HostConfig.SecurityOpt.push "seccomp=#{Settings.clsi.docker.seccomp_profile}"
return options
_fingerprintContainer: (containerOptions) ->
# Yay, Hashing!
json = JSON.stringify(containerOptions)
return crypto.createHash("md5").update(json).digest("hex")
startContainer: (options, volumes, attachStreamHandler, callback) ->
LockManager.runWithLock options.name, (releaseLock) ->
# Check that volumes exist before starting the container.
# When a container is started with volume pointing to a
# non-existent directory then docker creates the directory but
# with root ownership.
DockerRunner._checkVolumes options, volumes, (err) ->
return releaseLock(err) if err?
DockerRunner._startContainer options, volumes, attachStreamHandler, releaseLock
, callback
# Check that volumes exist and are directories
_checkVolumes: (options, volumes, callback = (error, containerName) ->) ->
if usingSiblingContainers()
# Server Pro, with sibling-containers active, skip checks
return callback(null)
checkVolume = (path, cb) ->
fs.stat path, (err, stats) ->
return cb(err) if err?
return cb(DockerRunner.ERR_NOT_DIRECTORY) if not stats?.isDirectory()
cb()
jobs = []
for vol of volumes
do (vol) ->
jobs.push (cb) -> checkVolume(vol, cb)
async.series jobs, callback
_startContainer: (options, volumes, attachStreamHandler, callback = ((error, output) ->)) ->
callback = _.once(callback)
name = options.name
logger.log {container_name: name}, "starting container"
container = dockerode.getContainer(name)
createAndStartContainer = ->
dockerode.createContainer options, (error, container) ->
return callback(error) if error?
startExistingContainer()
startExistingContainer = ->
DockerRunner.attachToContainer options.name, attachStreamHandler, (error)->
return callback(error) if error?
container.start (error) ->
if error? and error?.statusCode != 304 #already running
return callback(error)
else
callback()
container.inspect (error, stats)->
if error?.statusCode == 404
createAndStartContainer()
else if error?
logger.err {container_name: name, error:error}, "unable to inspect container to start"
return callback(error)
else
startExistingContainer()
attachToContainer: (containerId, attachStreamHandler, attachStartCallback) ->
container = dockerode.getContainer(containerId)
container.attach {stdout: 1, stderr: 1, stream: 1}, (error, stream) ->
if error?
logger.error err: error, container_id: containerId, "error attaching to container"
return attachStartCallback(error)
else
attachStartCallback()
logger.log container_id: containerId, "attached to container"
MAX_OUTPUT = 1024 * 1024 # limit output to 1MB
createStringOutputStream = (name) ->
return {
data: ""
overflowed: false
write: (data) ->
return if @overflowed
if @data.length < MAX_OUTPUT
@data += data
else
logger.error container_id: containerId, length: @data.length, maxLen: MAX_OUTPUT, "#{name} exceeds max size"
@data += "(...truncated at #{MAX_OUTPUT} chars...)"
@overflowed = true
# kill container if too much output
# docker.containers.kill(containerId, () ->)
}
stdout = createStringOutputStream "stdout"
stderr = createStringOutputStream "stderr"
container.modem.demuxStream(stream, stdout, stderr)
stream.on "error", (err) ->
logger.error err: err, container_id: containerId, "error reading from container stream"
stream.on "end", () ->
attachStreamHandler null, {stdout: stdout.data, stderr: stderr.data}
waitForContainer: (containerId, timeout, _callback = (error, exitCode) ->) ->
callback = (args...) ->
_callback(args...)
# Only call the callback once
_callback = () ->
container = dockerode.getContainer(containerId)
timedOut = false
timeoutId = setTimeout () ->
timedOut = true
logger.log container_id: containerId, "timeout reached, killing container"
container.kill(() ->)
, timeout
logger.log container_id: containerId, "waiting for docker container"
container.wait (error, res) ->
if error?
clearTimeout timeoutId
logger.error err: error, container_id: containerId, "error waiting for container"
return callback(error)
if timedOut
logger.log containerId: containerId, "docker container timed out"
error = DockerRunner.ERR_TIMED_OUT
error.timedout = true
callback error
else
clearTimeout timeoutId
logger.log container_id: containerId, exitCode: res.StatusCode, "docker container returned"
callback null, res.StatusCode
destroyContainer: (containerName, containerId, shouldForce, callback = (error) ->) ->
# We want the containerName for the lock and, ideally, the
# containerId to delete. There is a bug in the docker.io module
# where if you delete by name and there is an error, it throws an
# async exception, but if you delete by id it just does a normal
# error callback. We fall back to deleting by name if no id is
# supplied.
LockManager.runWithLock containerName, (releaseLock) ->
DockerRunner._destroyContainer containerId or containerName, shouldForce, releaseLock
, callback
_destroyContainer: (containerId, shouldForce, callback = (error) ->) ->
logger.log container_id: containerId, "destroying docker container"
container = dockerode.getContainer(containerId)
container.remove {force: shouldForce == true}, (error) ->
if error? and error?.statusCode == 404
logger.warn err: error, container_id: containerId, "container not found, continuing"
error = null
if error?
logger.error err: error, container_id: containerId, "error destroying container"
else
logger.log container_id: containerId, "destroyed container"
callback(error)
# handle expiry of docker containers
MAX_CONTAINER_AGE: Settings.clsi.docker.maxContainerAge or oneHour = 60 * 60 * 1000
examineOldContainer: (container, callback = (error, name, id, ttl)->) ->
name = container.Name or container.Names?[0]
created = container.Created * 1000 # creation time is returned in seconds
now = Date.now()
age = now - created
maxAge = DockerRunner.MAX_CONTAINER_AGE
ttl = maxAge - age
logger.log {containerName: name, created: created, now: now, age: age, maxAge: maxAge, ttl: ttl}, "checking whether to destroy container"
callback(null, name, container.Id, ttl)
destroyOldContainers: (callback = (error) ->) ->
dockerode.listContainers all: true, (error, containers) ->
return callback(error) if error?
jobs = []
for container in containers or []
do (container) ->
DockerRunner.examineOldContainer container, (err, name, id, ttl) ->
if name.slice(0, 9) == '/project-' && ttl <= 0
jobs.push (cb) ->
DockerRunner.destroyContainer name, id, false, () -> cb()
# Ignore errors because some containers get stuck but
# will be destroyed next time
async.series jobs, callback
startContainerMonitor: () ->
logger.log {maxAge: DockerRunner.MAX_CONTAINER_AGE}, "starting container expiry"
# randomise the start time
randomDelay = Math.floor(Math.random() * 5 * 60 * 1000)
setTimeout () ->
setInterval () ->
DockerRunner.destroyOldContainers()
, oneHour = 60 * 60 * 1000
, randomDelay
DockerRunner.startContainerMonitor()
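A hedged sketch of invoking the runner directly; the argument order follows the run() signature above, while the project id, host path, and image tag are illustrative:
# Sketch: run latexmk inside a sibling container. $COMPILE_DIR is
# rewritten to /compile by run() before the container starts.
DockerRunner = require "./app/js/DockerRunner"
command = ["latexmk", "-pdf", "$COMPILE_DIR/main.tex"]
DockerRunner.run "<project-id>", command, "/var/lib/sharelatex/data/compiles/<project-id>",
  "gcr.io/overleaf-ops/texlive-full:2017.1", 60 * 1000, {}, (error, output) ->
    console.log error?.message, output?.stdout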

app/coffee/DraftModeManager.coffee Normal file

@@ -0,0 +1,24 @@
fs = require "fs"
logger = require "logger-sharelatex"
module.exports = DraftModeManager =
injectDraftMode: (filename, callback = (error) ->) ->
fs.readFile filename, "utf8", (error, content) ->
return callback(error) if error?
# avoid adding draft mode more than once
if content?.indexOf("\\documentclass\[draft") >= 0
return callback()
modified_content = DraftModeManager._injectDraftOption content
logger.log {
content: content.slice(0,1024), # \documentclass is normally v near the top
modified_content: modified_content.slice(0,1024),
filename
}, "injected draft class"
fs.writeFile filename, modified_content, callback
_injectDraftOption: (content) ->
content
# With existing options (must be first, otherwise both are applied)
.replace(/\\documentclass\[/g, "\\documentclass[draft,")
# Without existing options
.replace(/\\documentclass\{/g, "\\documentclass[draft]{")
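The two regex cases above behave as follows (illustrative inputs and outputs):
# Sketch of the two rewrite cases handled by _injectDraftOption:
DraftModeManager = require "./app/js/DraftModeManager"
console.log DraftModeManager._injectDraftOption "\\documentclass[a4paper]{article}"
# => "\\documentclass[draft,a4paper]{article}"
console.log DraftModeManager._injectDraftOption "\\documentclass{article}"
# => "\\documentclass[draft]{article}"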

25
app/coffee/Errors.coffee Normal file

@@ -0,0 +1,25 @@
NotFoundError = (message) ->
error = new Error(message)
error.name = "NotFoundError"
error.__proto__ = NotFoundError.prototype
return error
NotFoundError.prototype.__proto__ = Error.prototype
FilesOutOfSyncError = (message) ->
error = new Error(message)
error.name = "FilesOutOfSyncError"
error.__proto__ = FilesOutOfSyncError.prototype
return error
FilesOutOfSyncError.prototype.__proto__ = Error.prototype
AlreadyCompilingError = (message) ->
error = new Error(message)
error.name = "AlreadyCompilingError"
error.__proto__ = AlreadyCompilingError.prototype
return error
AlreadyCompilingError.prototype.__proto__ = Error.prototype
module.exports = Errors =
NotFoundError: NotFoundError
FilesOutOfSyncError: FilesOutOfSyncError
AlreadyCompilingError: AlreadyCompilingError
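The __proto__ wiring above is what makes instanceof work for these factory-style constructors, which the error dispatch in CompileController.compile relies on. A quick sketch:
# Sketch: errors built by the factories still satisfy instanceof.
Errors = require "./app/js/Errors"
err = Errors.NotFoundError("no output directory")
console.log err instanceof Errors.NotFoundError   # true
console.log err instanceof Error                  # true
console.log err.name                              # 'NotFoundError'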

app/coffee/LatexRunner.coffee

@@ -2,56 +2,94 @@ Path = require "path"
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
Metrics = require "./Metrics"
CommandRunner = require(Settings.clsi?.commandRunner or "./CommandRunner")
CommandRunner = require "./CommandRunner"
ProcessTable = {} # table of currently running jobs (pids or docker container names)
module.exports = LatexRunner =
runLatex: (project_id, options, callback = (error) ->) ->
{directory, mainFile, compiler, timeout} = options
{directory, mainFile, compiler, timeout, image, environment, flags} = options
compiler ||= "pdflatex"
timeout ||= 60000 # milliseconds
logger.log directory: directory, compiler: compiler, timeout: timeout, mainFile: mainFile, "starting compile"
logger.log directory: directory, compiler: compiler, timeout: timeout, mainFile: mainFile, environment: environment, flags:flags, "starting compile"
# We want to run latexmk on the tex file which we will automatically
# generate from the Rtex/Rmd/md file.
mainFile = mainFile.replace(/\.(Rtex|md|Rmd)$/, ".tex")
if compiler == "pdflatex"
command = LatexRunner._pdflatexCommand mainFile
command = LatexRunner._pdflatexCommand mainFile, flags
else if compiler == "latex"
command = LatexRunner._latexCommand mainFile
command = LatexRunner._latexCommand mainFile, flags
else if compiler == "xelatex"
command = LatexRunner._xelatexCommand mainFile
command = LatexRunner._xelatexCommand mainFile, flags
else if compiler == "lualatex"
command = LatexRunner._lualatexCommand mainFile
command = LatexRunner._lualatexCommand mainFile, flags
else
return callback new Error("unknown compiler: #{compiler}")
CommandRunner.run project_id, command, directory, timeout, callback
if Settings.clsi?.strace
command = ["strace", "-o", "strace", "-ff"].concat(command)
_latexmkBaseCommand: [ "latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR"]
id = "#{project_id}" # record running project under this id
_pdflatexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-pdf", "-e", "$pdflatex='pdflatex -synctex=1 -interaction=batchmode %O %S'",
ProcessTable[id] = CommandRunner.run project_id, command, directory, image, timeout, environment, (error, output) ->
delete ProcessTable[id]
return callback(error) if error?
runs = output?.stderr?.match(/^Run number \d+ of .*latex/mg)?.length or 0
failed = if output?.stdout?.match(/^Latexmk: Errors/m)? then 1 else 0
# counters from latexmk output
stats = {}
stats["latexmk-errors"] = failed
stats["latex-runs"] = runs
stats["latex-runs-with-errors"] = if failed then runs else 0
stats["latex-runs-#{runs}"] = 1
stats["latex-runs-with-errors-#{runs}"] = if failed then 1 else 0
# timing information from /usr/bin/time
timings = {}
stderr = output?.stderr
timings["cpu-percent"] = stderr?.match(/Percent of CPU this job got: (\d+)/m)?[1] or 0
timings["cpu-time"] = stderr?.match(/User time.*: (\d+.\d+)/m)?[1] or 0
timings["sys-time"] = stderr?.match(/System time.*: (\d+.\d+)/m)?[1] or 0
callback error, output, stats, timings
killLatex: (project_id, callback = (error) ->) ->
id = "#{project_id}"
logger.log {id:id}, "killing running compile"
if not ProcessTable[id]?
logger.warn {id}, "no such project to kill"
return callback(null)
else
CommandRunner.kill ProcessTable[id], callback
_latexmkBaseCommand: (flags) ->
args = ["latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR", "-synctex=1","-interaction=batchmode"]
if flags
args = args.concat(flags)
(Settings?.clsi?.latexmkCommandPrefix || []).concat(args)
_pdflatexCommand: (mainFile, flags) ->
LatexRunner._latexmkBaseCommand(flags).concat [
"-pdf",
Path.join("$COMPILE_DIR", mainFile)
]
_latexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-pdfdvi", "-e", "$latex='latex -synctex=1 -interaction=batchmode %O %S'",
_latexCommand: (mainFile, flags) ->
LatexRunner._latexmkBaseCommand(flags).concat [
"-pdfdvi",
Path.join("$COMPILE_DIR", mainFile)
]
_xelatexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-xelatex", "-e", "$pdflatex='xelatex -synctex=1 -interaction=batchmode %O %S'",
_xelatexCommand: (mainFile, flags) ->
LatexRunner._latexmkBaseCommand(flags).concat [
"-xelatex",
Path.join("$COMPILE_DIR", mainFile)
]
_lualatexCommand: (mainFile) ->
LatexRunner._latexmkBaseCommand.concat [
"-pdf", "-e", "$pdflatex='lualatex -synctex=1 -interaction=batchmode %O %S'",
_lualatexCommand: (mainFile, flags) ->
LatexRunner._latexmkBaseCommand(flags).concat [
"-lualatex",
Path.join("$COMPILE_DIR", mainFile)
]
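
Putting the pieces above together, a pdflatex compile with flags: ["-file-line-error"] builds a command like the following (a sketch assuming no latexmkCommandPrefix is configured; $COMPILE_DIR is substituted later by the command runner):

LatexRunner = require "./LatexRunner"  # the module above
console.log LatexRunner._pdflatexCommand("main.tex", ["-file-line-error"])
# [ 'latexmk', '-cd', '-f', '-jobname=output', '-auxdir=$COMPILE_DIR', '-outdir=$COMPILE_DIR',
#   '-synctex=1', '-interaction=batchmode', '-file-line-error', '-pdf', '$COMPILE_DIR/main.tex' ]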


@@ -0,0 +1,48 @@
spawn = require("child_process").spawn
logger = require "logger-sharelatex"
logger.info "using standard command runner"
module.exports = CommandRunner =
run: (project_id, command, directory, image, timeout, environment, callback = (error) ->) ->
command = (arg.toString().replace('$COMPILE_DIR', directory) for arg in command)
logger.log project_id: project_id, command: command, directory: directory, "running command"
logger.warn "timeouts and sandboxing are not enabled with CommandRunner"
# merge environment settings
env = {}
env[key] = value for key, value of process.env
env[key] = value for key, value of environment
# run command as detached process so it has its own process group (which can be killed if needed)
proc = spawn command[0], command.slice(1), cwd: directory, env: env, detached: true # group leader, so kill(-pid) below can reach the whole group
stdout = ""
proc.stdout.on "data", (data)->
stdout += data
proc.on "error", (err)->
logger.err err:err, project_id:project_id, command: command, directory: directory, "error running command"
callback(err)
proc.on "close", (code, signal) ->
logger.info code:code, signal:signal, project_id:project_id, "command exited"
if signal is 'SIGTERM' # signal from kill method below
err = new Error("terminated")
err.terminated = true
return callback(err)
else if code is 1 # exit status from chktex
err = new Error("exited")
err.code = code
return callback(err)
else
callback(null, {"stdout": stdout})
return proc.pid # return process id to allow job to be killed if necessary
kill: (pid, callback = (error) ->) ->
try
process.kill -pid # kill all processes in group
catch err
return callback(err)
callback()


@@ -0,0 +1,31 @@
Settings = require('settings-sharelatex')
logger = require "logger-sharelatex"
Lockfile = require('lockfile') # from https://github.com/npm/lockfile
Errors = require "./Errors"
fs = require("fs")
Path = require("path")
module.exports = LockManager =
LOCK_TEST_INTERVAL: 1000 # 1s between each test of the lock
MAX_LOCK_WAIT_TIME: 15000 # 15s maximum time to spend trying to get the lock
LOCK_STALE: 5*60*1000 # 5 mins until the lock is considered stale and auto-expires
runWithLock: (path, runner = ((releaseLock = (error) ->) ->), callback = ((error) ->)) ->
lockOpts =
wait: @MAX_LOCK_WAIT_TIME
pollPeriod: @LOCK_TEST_INTERVAL
stale: @LOCK_STALE
Lockfile.lock path, lockOpts, (error) ->
if error?.code is 'EEXIST'
return callback new Errors.AlreadyCompilingError("compile in progress")
else if error?
fs.lstat path, (statLockErr, statLock)->
fs.lstat Path.dirname(path), (statDirErr, statDir)->
fs.readdir Path.dirname(path), (readdirErr, readdirDir)->
logger.err error:error, path:path, statLock:statLock, statLockErr:statLockErr, statDir:statDir, statDirErr: statDirErr, readdirErr:readdirErr, readdirDir:readdirDir, "unable to get lock"
return callback(error)
else
runner (error1, args...) ->
Lockfile.unlock path, (error2) ->
error = error1 or error2
return callback(error) if error?
callback(null, args...)
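
Callers do their work inside runner and hand results to releaseLock; the manager unlocks and forwards them. A hedged usage sketch (the lock path is illustrative):

LockManager = require "./LockManager"  # the module above
LockManager.runWithLock "/compile/project.lock", (releaseLock) ->
  # ... do the compile ...
  releaseLock(null, "compile result")
, (error, result) ->
  if error?.name is "AlreadyCompilingError"
    console.log "another compile already holds the lock"
  else
    console.log "lock released, got", result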


@@ -4,12 +4,17 @@ fse = require "fs-extra"
Path = require "path"
logger = require "logger-sharelatex"
_ = require "underscore"
Settings = require "settings-sharelatex"
crypto = require "crypto"
OutputFileOptimiser = require "./OutputFileOptimiser"
module.exports = OutputCacheManager =
CACHE_SUBDIR: '.cache/clsi'
BUILD_REGEX: /^[0-9a-f]+$/ # build id is Date.now() converted to hex
ARCHIVE_SUBDIR: '.archive/clsi'
# build id is HEXDATE-HEXRANDOM from Date.now() and random bytes
# for backwards compatibility, make the random-bytes part optional
BUILD_REGEX: /^[0-9a-f]+(-[0-9a-f]+)?$/
CACHE_LIMIT: 2 # maximum number of cache directories
CACHE_AGE: 60*60*1000 # up to one hour old
@@ -21,40 +26,33 @@ module.exports = OutputCacheManager =
# for invalid build id, return top level
return file
generateBuildId: (callback = (error, buildId) ->) ->
# generate a secure build id from Date.now() and 8 random bytes in hex
crypto.randomBytes 8, (err, buf) ->
return callback(err) if err?
random = buf.toString('hex')
date = Date.now().toString(16)
callback err, "#{date}-#{random}"
saveOutputFiles: (outputFiles, compileDir, callback = (error) ->) ->
OutputCacheManager.generateBuildId (err, buildId) ->
return callback(err) if err?
OutputCacheManager.saveOutputFilesInBuildDir outputFiles, compileDir, buildId, callback
saveOutputFilesInBuildDir: (outputFiles, compileDir, buildId, callback = (error) ->) ->
# make a compileDir/CACHE_SUBDIR/build_id directory and
# copy all the output files into it
cacheRoot = Path.join(compileDir, OutputCacheManager.CACHE_SUBDIR)
# Put the files into a new cache subdirectory
buildId = Date.now().toString(16)
cacheDir = Path.join(compileDir, OutputCacheManager.CACHE_SUBDIR, buildId)
# let file expiry run in the background
OutputCacheManager.expireOutputFiles cacheRoot, {keep: buildId}
# Is it a per-user compile? check if compile directory is PROJECTID-USERID
perUser = Path.basename(compileDir).match(/^[0-9a-f]{24}-[0-9a-f]{24}$/)
checkFile = (src, callback) ->
# check if we have a valid file to copy into the cache
fs.stat src, (err, stats) ->
# Archive logs in background
if Settings.clsi?.archive_logs or Settings.clsi?.strace
OutputCacheManager.archiveLogs outputFiles, compileDir, buildId, (err) ->
if err?
# some problem reading the file
logger.error err: err, file: src, "stat error for file in cache"
callback(err)
else if not stats.isFile()
# other filetype - reject it
logger.error err: err, src: src, dst: dst, stat: stats, "nonfile output - refusing to copy to cache"
callback(new Error("output file is not a file"), file)
else
# it's a plain file, ok to copy
callback(null)
copyFile = (src, dst, callback) ->
# copy output file into the cache
fse.copy src, dst, (err) ->
if err?
logger.error err: err, src: src, dst: dst, "copy error for file in cache"
callback(err)
else
# call the optimiser for the file too
OutputFileOptimiser.optimiseFile src, dst, callback
logger.warn err:err, "error archiving log files"
# make the new cache directory
fse.ensureDir cacheDir, (err) ->
@@ -63,21 +61,57 @@ module.exports = OutputCacheManager =
callback(err, outputFiles)
else
# copy all the output files into the new cache directory
results = []
async.mapSeries outputFiles, (file, cb) ->
# don't send dot files as output, express doesn't serve them
if OutputCacheManager._fileIsHidden(file.path)
logger.warn compileDir: compileDir, path: file.path, "ignoring dotfile in output"
return cb()
# copy other files into cache directory if valid
newFile = _.clone(file)
[src, dst] = [Path.join(compileDir, file.path), Path.join(cacheDir, file.path)]
checkFile src, (err) ->
copyFile src, dst, (err) ->
if not err?
OutputCacheManager._checkFileIsSafe src, (err, isSafe) ->
return cb(err) if err?
if !isSafe
return cb()
OutputCacheManager._checkIfShouldCopy src, (err, shouldCopy) ->
return cb(err) if err?
if !shouldCopy
return cb()
OutputCacheManager._copyFile src, dst, (err) ->
return cb(err) if err?
newFile.build = buildId # attach a build id if we cached the file
cb(err, newFile)
, (err, results) ->
results.push newFile
cb()
, (err) ->
if err?
# pass back the original files if we encountered *any* error
callback(err, outputFiles)
# clean up the directory we just created
fse.remove cacheDir, (err) ->
if err?
logger.error err: err, dir: cacheDir, "error removing cache dir after failure"
else
# pass back the list of new files in the cache
callback(err, results)
# let file expiry run in the background, expire all previous files if per-user
OutputCacheManager.expireOutputFiles cacheRoot, {keep: buildId, limit: if perUser then 1 else null}
archiveLogs: (outputFiles, compileDir, buildId, callback = (error) ->) ->
archiveDir = Path.join(compileDir, OutputCacheManager.ARCHIVE_SUBDIR, buildId)
logger.log {dir: archiveDir}, "archiving log files for project"
fse.ensureDir archiveDir, (err) ->
return callback(err) if err?
async.mapSeries outputFiles, (file, cb) ->
[src, dst] = [Path.join(compileDir, file.path), Path.join(archiveDir, file.path)]
OutputCacheManager._checkFileIsSafe src, (err, isSafe) ->
return cb(err) if err?
return cb() if !isSafe
OutputCacheManager._checkIfShouldArchive src, (err, shouldArchive) ->
return cb(err) if err?
return cb() if !shouldArchive
OutputCacheManager._copyFile src, dst, cb
, callback
expireOutputFiles: (cacheRoot, options, callback = (error) ->) ->
# look in compileDir for build dirs and delete if > N or age of mod time > T
@@ -92,10 +126,13 @@ module.exports = OutputCacheManager =
isExpired = (dir, index) ->
return false if options?.keep == dir
# remove any directories over the requested (non-null) limit
return true if options?.limit? and index > options.limit
# remove any directories over the hard limit
return true if index > OutputCacheManager.CACHE_LIMIT
# we can get the build time from the directory name
dirTime = parseInt(dir, 16)
# we can get the build time from the first part of the directory name DDDD-RRRR
# DDDD is date and RRRR is random bytes
dirTime = parseInt(dir.split('-')?[0], 16)
age = currentTime - dirTime
return age > OutputCacheManager.CACHE_AGE
@@ -111,3 +148,52 @@ module.exports = OutputCacheManager =
async.eachSeries toRemove, (dir, cb) ->
removeDir dir, cb
, callback
_fileIsHidden: (path) ->
return path?.match(/^\.|\/\./)?
_checkFileIsSafe: (src, callback = (error, isSafe) ->) ->
# check if we have a valid file to copy into the cache
fs.stat src, (err, stats) ->
if err?.code is 'ENOENT'
logger.warn err: err, file: src, "file has disappeared before copying to build cache"
callback(err, false)
else if err?
# some other problem reading the file
logger.error err: err, file: src, "stat error for file in cache"
callback(err, false)
else if not stats.isFile()
# other filetype - reject it
logger.warn src: src, stat: stats, "nonfile output - refusing to copy to cache"
callback(null, false)
else
# it's a plain file, ok to copy
callback(null, true)
_copyFile: (src, dst, callback) ->
# copy output file into the cache
fse.copy src, dst, (err) ->
if err?.code is 'ENOENT'
logger.warn err: err, file: src, "file has disappeared when copying to build cache"
callback(err, false)
else if err?
logger.error err: err, src: src, dst: dst, "copy error for file in cache"
callback(err)
else
if Settings.clsi?.optimiseInDocker
# don't run any optimisations on the pdf when they are done
# in the docker container
callback()
else
# call the optimiser for the file too
OutputFileOptimiser.optimiseFile src, dst, callback
_checkIfShouldCopy: (src, callback = (err, shouldCopy) ->) ->
return callback(null, !Path.basename(src).match(/^strace/))
_checkIfShouldArchive: (src, callback = (err, shouldArchive) ->) ->
if Path.basename(src).match(/^strace/)
return callback(null, true)
if Settings.clsi?.archive_logs and Path.basename(src) in ["output.log", "output.blg"]
return callback(null, true)
return callback(null, false)
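
The expiry check above recovers the creation time from the first half of the directory name; for example (values illustrative):

dir = "16e1b2c3d4-9f8e7d6c5b4a3210"        # HEXDATE-HEXRANDOM build directory
dirTime = parseInt(dir.split('-')[0], 16)  # creation time in ms since the epoch
age = Date.now() - dirTime
console.log "expired" if age > 60 * 60 * 1000  # CACHE_AGE above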


@@ -5,16 +5,15 @@ spawn = require("child_process").spawn
logger = require "logger-sharelatex"
module.exports = OutputFileFinder =
findOutputFiles: (resources, directory, callback = (error, outputFiles) ->) ->
findOutputFiles: (resources, directory, callback = (error, outputFiles, allFiles) ->) ->
incomingResources = {}
for resource in resources
incomingResources[resource.path] = true
logger.log directory: directory, "getting output files"
OutputFileFinder._getAllFiles directory, (error, allFiles = []) ->
return callback(error) if error?
jobs = []
if error?
logger.err err:error, "error finding all output files"
return callback(error)
outputFiles = []
for file in allFiles
if !incomingResources[file]
@@ -22,14 +21,16 @@ module.exports = OutputFileFinder =
path: file
type: file.match(/\.([^\.]+)$/)?[1]
}
callback null, outputFiles
callback null, outputFiles, allFiles
_getAllFiles: (directory, _callback = (error, fileList) ->) ->
callback = (error, fileList) ->
_callback(error, fileList)
_callback = () ->
args = [directory, "-name", ".cache", "-prune", "-o", "-type", "f", "-print"]
# don't include clsi-specific files/directories in the output list
EXCLUDE_DIRS = ["-name", ".cache", "-o", "-name", ".archive","-o", "-name", ".project-*"]
args = [directory, "(", EXCLUDE_DIRS..., ")", "-prune", "-o", "-type", "f", "-print"]
logger.log args: args, "running find command"
proc = spawn("find", args)


@@ -2,6 +2,7 @@ fs = require "fs"
Path = require "path"
spawn = require("child_process").spawn
logger = require "logger-sharelatex"
Metrics = require "./Metrics"
_ = require "underscore"
module.exports = OutputFileOptimiser =
@@ -10,15 +11,31 @@ module.exports = OutputFileOptimiser =
# check output file (src) and see if we can optimise it, storing
# the result in the build directory (dst)
if src.match(/\/output\.pdf$/)
OutputFileOptimiser.optimisePDF src, dst, callback
OutputFileOptimiser.checkIfPDFIsOptimised src, (err, isOptimised) ->
return callback(null) if err? or isOptimised
OutputFileOptimiser.optimisePDF src, dst, callback
else
callback (null)
checkIfPDFIsOptimised: (file, callback) ->
SIZE = 16*1024 # check the header of the pdf
result = new Buffer(SIZE)
result.fill(0) # prevent leakage of uninitialised buffer
fs.open file, "r", (err, fd) ->
return callback(err) if err?
fs.read fd, result, 0, SIZE, 0, (errRead, bytesRead, buffer) ->
fs.close fd, (errClose) ->
return callback(errRead) if errRead?
return callback(errClose) if errClose?
isOptimised = buffer.toString('ascii').indexOf("/Linearized 1") >= 0
callback(null, isOptimised)
optimisePDF: (src, dst, callback = (error) ->) ->
tmpOutput = dst + '.opt'
args = ["--linearize", src, tmpOutput]
logger.log args: args, "running qpdf command"
timer = new Metrics.Timer("qpdf")
proc = spawn("qpdf", args)
stdout = ""
proc.stdout.on "data", (chunk) ->
@@ -28,6 +45,7 @@ module.exports = OutputFileOptimiser =
logger.warn {err, args}, "qpdf failed"
callback(null) # ignore the error
proc.on "close", (code) ->
timer.done()
if code != 0
logger.warn {code, args}, "qpdf returned error"
return callback(null) # ignore the error
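
checkIfPDFIsOptimised above sniffs the first 16KB for qpdf's /Linearized marker; the same test as a standalone sketch, using Buffer.alloc rather than the deprecated constructor (an editorial substitution, not the file's code):

fs = require "fs"
isLinearized = (file, callback) ->
  size = 16 * 1024
  buf = Buffer.alloc(size)  # zero-filled, so a short read cannot leak stale memory
  fs.open file, "r", (err, fd) ->
    return callback(err) if err?
    fs.read fd, buf, 0, size, 0, (errRead) ->
      fs.close fd, (errClose) ->
        return callback(errRead or errClose) if errRead? or errClose?
        callback null, buf.toString("ascii").indexOf("/Linearized 1") >= 0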


@@ -1,21 +1,28 @@
UrlCache = require "./UrlCache"
CompileManager = require "./CompileManager"
db = require "./db"
dbQueue = require "./DbQueue"
async = require "async"
logger = require "logger-sharelatex"
oneDay = 24 * 60 * 60 * 1000
Settings = require "settings-sharelatex"
module.exports = ProjectPersistenceManager =
EXPIRY_TIMEOUT: oneDay = 24 * 60 * 60 * 1000 #ms
EXPIRY_TIMEOUT: Settings.project_cache_length_ms || oneDay * 2.5
markProjectAsJustAccessed: (project_id, callback = (error) ->) ->
db.Project.findOrCreate(where: {project_id: project_id})
.spread(
(project, created) ->
project.updateAttributes(lastAccessed: new Date())
.then(() -> callback())
.error callback
)
.error callback
job = (cb)->
db.Project.findOrCreate(where: {project_id: project_id})
.spread(
(project, created) ->
project.updateAttributes(lastAccessed: new Date())
.then(() -> cb())
.error cb
)
.error cb
dbQueue.queue.push(job, callback)
clearExpiredProjects: (callback = (error) ->) ->
ProjectPersistenceManager._findExpiredProjectIds (error, project_ids) ->
@@ -24,29 +31,54 @@ module.exports = ProjectPersistenceManager =
jobs = for project_id in (project_ids or [])
do (project_id) ->
(callback) ->
ProjectPersistenceManager.clearProject project_id, (err) ->
ProjectPersistenceManager.clearProjectFromCache project_id, (err) ->
if err?
logger.error err: err, project_id: project_id, "error clearing project"
callback()
async.series jobs, callback
clearProject: (project_id, callback = (error) ->) ->
logger.log project_id: project_id, "clearing project"
CompileManager.clearProject project_id, (error) ->
return callback(error) if error?
UrlCache.clearProject project_id, (error) ->
async.series jobs, (error) ->
return callback(error) if error?
ProjectPersistenceManager._clearProjectFromDatabase project_id, (error) ->
return callback(error) if error?
callback()
CompileManager.clearExpiredProjects ProjectPersistenceManager.EXPIRY_TIMEOUT, (error) ->
callback() # ignore any errors from deleting directories
clearProject: (project_id, user_id, callback = (error) ->) ->
logger.log project_id: project_id, user_id:user_id, "clearing project for user"
CompileManager.clearProject project_id, user_id, (error) ->
return callback(error) if error?
ProjectPersistenceManager.clearProjectFromCache project_id, (error) ->
return callback(error) if error?
callback()
clearProjectFromCache: (project_id, callback = (error) ->) ->
logger.log project_id: project_id, "clearing project from cache"
UrlCache.clearProject project_id, (error) ->
if error?
logger.err error:error, project_id: project_id, "error clearing project from cache"
return callback(error)
ProjectPersistenceManager._clearProjectFromDatabase project_id, (error) ->
if error?
logger.err error:error, project_id:project_id, "error clearing project from database"
callback(error)
_clearProjectFromDatabase: (project_id, callback = (error) ->) ->
db.Project.destroy(where: {project_id: project_id})
.then(() -> callback())
.error callback
logger.log project_id:project_id, "clearing project from database"
job = (cb)->
db.Project.destroy(where: {project_id: project_id})
.then(() -> cb())
.error cb
dbQueue.queue.push(job, callback)
_findExpiredProjectIds: (callback = (error, project_ids) ->) ->
db.Project.findAll(where: ["lastAccessed < ?", new Date(Date.now() - ProjectPersistenceManager.EXPIRY_TIMEOUT)])
.then((projects) ->
callback null, projects.map((project) -> project.project_id)
).error callback
job = (cb)->
keepProjectsFrom = new Date(Date.now() - ProjectPersistenceManager.EXPIRY_TIMEOUT)
q = {}
q[db.op.lt] = keepProjectsFrom
db.Project.findAll(where:{lastAccessed:q})
.then((projects) ->
cb null, projects.map((project) -> project.project_id)
).error cb
dbQueue.queue.push(job, callback)
logger.log {EXPIRY_TIMEOUT: ProjectPersistenceManager.EXPIRY_TIMEOUT}, "project asset expiry timeout"
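
DbQueue itself is not shown in this diff; the push(job, callback) usage above only assumes something like an async.queue with concurrency 1, serialising access to the SQLite database. A hedged stand-in sketch:

async = require "async"
# minimal stand-in for ./DbQueue: run queued jobs one at a time
module.exports =
  queue: async.queue ((job, cb) -> job(cb)), 1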


@@ -1,6 +1,8 @@
settings = require("settings-sharelatex")
module.exports = RequestParser =
VALID_COMPILERS: ["pdflatex", "latex", "xelatex", "lualatex"]
MAX_TIMEOUT: 300
MAX_TIMEOUT: 600
parse: (body, callback = (error, data) ->) ->
response = {}
@@ -21,6 +23,41 @@ module.exports = RequestParser =
compile.options.timeout
default: RequestParser.MAX_TIMEOUT
type: "number"
response.imageName = @_parseAttribute "imageName",
compile.options.imageName,
type: "string"
response.draft = @_parseAttribute "draft",
compile.options.draft,
default: false,
type: "boolean"
response.check = @_parseAttribute "check",
compile.options.check,
type: "string"
response.flags = @_parseAttribute "flags",
compile.options.flags,
default: [],
type: "object"
# The syncType specifies whether the request contains all
# resources (full) or only those resources to be updated
# in-place (incremental).
response.syncType = @_parseAttribute "syncType",
compile.options.syncType,
validValues: ["full", "incremental"]
type: "string"
# The syncState is an identifier passed in with the request
# which has the property that it changes when any resource is
# added, deleted, moved or renamed.
#
# on syncType full the syncState identifier is passed in and
# stored
#
# on syncType incremental the syncState identifier must match
# the stored value
response.syncState = @_parseAttribute "syncState",
compile.options.syncState,
type: "string"
if response.timeout > RequestParser.MAX_TIMEOUT
response.timeout = RequestParser.MAX_TIMEOUT
@@ -32,7 +69,13 @@ module.exports = RequestParser =
compile.rootResourcePath
default: "main.tex"
type: "string"
response.rootResourcePath = RequestParser._sanitizePath(rootResourcePath)
originalRootResourcePath = rootResourcePath
sanitizedRootResourcePath = RequestParser._sanitizePath(rootResourcePath)
response.rootResourcePath = RequestParser._checkPath(sanitizedRootResourcePath)
for resource in response.resources
if resource.path == originalRootResourcePath
resource.path = sanitizedRootResourcePath
catch error
return callback error
@@ -48,7 +91,7 @@ module.exports = RequestParser =
throw "resource modified date could not be understood: #{resource.modified}"
if !resource.url? and !resource.content?
throw "all resources should have either a url or content attribute"
throw "all resources should have either a url or content attribute"
if resource.content? and typeof resource.content != "string"
throw "content attribute should be a string"
if resource.url? and typeof resource.url != "string"
@@ -71,9 +114,15 @@ module.exports = RequestParser =
throw "#{name} attribute should be a #{options.type}"
else
return options.default if options.default?
throw "Default not implemented"
return attribute
_sanitizePath: (path) ->
# See http://php.net/manual/en/function.escapeshellcmd.php
path.replace(/[\#\&\;\`\|\*\?\~\<\>\^\(\)\[\]\{\}\$\\\x0A\xFF\x00]/g, "")
_checkPath: (path) ->
# check that the request does not use a relative path
for dir in path.split('/')
if dir == '..'
throw "relative path in root resource"
return path
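
A compile request exercising the new options would look roughly like this (all values illustrative):

exampleRequest =
  compile:
    options:
      compiler: "pdflatex"
      timeout: 120
      imageName: "quay.io/sharelatex/texlive-full:2017.1"
      draft: false
      flags: ["-file-line-error"]
      syncType: "incremental"
      syncState: "84d3a1b2"
    rootResourcePath: "main.tex"
    resources: [
      { path: "main.tex", content: "\\documentclass{article}..." }
    ]
console.log JSON.stringify(exampleRequest, null, 2)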


@@ -0,0 +1,72 @@
Path = require "path"
fs = require "fs"
logger = require "logger-sharelatex"
settings = require("settings-sharelatex")
Errors = require "./Errors"
SafeReader = require "./SafeReader"
module.exports = ResourceStateManager =
# The sync state is an identifier which must match for an
# incremental update to be allowed.
#
# The initial value is passed in and stored on a full
# compile, along with the list of resources.
#
# Subsequent incremental compiles must come with the same value - if
# not they will be rejected with a 409 Conflict response. The
# previous list of resources is returned.
#
# An incremental compile can only update existing files with new
# content. The sync state identifier must change if any docs or
# files are moved, added, deleted or renamed.
SYNC_STATE_FILE: ".project-sync-state"
SYNC_STATE_MAX_SIZE: 128*1024
saveProjectState: (state, resources, basePath, callback = (error) ->) ->
stateFile = Path.join(basePath, @SYNC_STATE_FILE)
if not state? # remove the file if no state passed in
logger.log state:state, basePath:basePath, "clearing sync state"
fs.unlink stateFile, (err) ->
if err? and err.code isnt 'ENOENT'
return callback(err)
else
return callback()
else
logger.log state:state, basePath:basePath, "writing sync state"
resourceList = (resource.path for resource in resources)
fs.writeFile stateFile, [resourceList..., "stateHash:#{state}"].join("\n"), callback
checkProjectStateMatches: (state, basePath, callback = (error, resources) ->) ->
stateFile = Path.join(basePath, @SYNC_STATE_FILE)
size = @SYNC_STATE_MAX_SIZE
SafeReader.readFile stateFile, size, 'utf8', (err, result, bytesRead) ->
return callback(err) if err?
if bytesRead is size
logger.error file:stateFile, size:size, bytesRead:bytesRead, "project state file truncated"
[resourceList..., oldState] = result?.toString()?.split("\n") or []
newState = "stateHash:#{state}"
logger.log state:state, oldState: oldState, basePath:basePath, stateMatches: (newState is oldState), "checking sync state"
if newState isnt oldState
return callback new Errors.FilesOutOfSyncError("invalid state for incremental update")
else
resources = ({path: path} for path in resourceList)
callback(null, resources)
checkResourceFiles: (resources, allFiles, basePath, callback = (error) ->) ->
# check the paths are all relative to current directory
for file in resources or []
for dir in file?.path?.split('/')
if dir == '..'
return callback new Error("relative path in resource file list")
# check if any of the input files are not present in list of files
seenFile = {}
for file in allFiles
seenFile[file] = true
missingFiles = (resource.path for resource in resources when not seenFile[resource.path])
if missingFiles?.length > 0
logger.err missingFiles:missingFiles, basePath:basePath, allFiles:allFiles, resources:resources, "missing input files for project"
return callback new Errors.FilesOutOfSyncError("resource files missing in incremental update")
else
callback()
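
On disk, the .project-sync-state file written by saveProjectState is just the resource paths followed by the state hash; mirroring the code above (values illustrative):

resources = [{path: "main.tex"}, {path: "chapters/intro.tex"}]
state = "84d3a1b2"
resourceList = (resource.path for resource in resources)
console.log [resourceList..., "stateHash:#{state}"].join("\n")
# main.tex
# chapters/intro.tex
# stateHash:84d3a1b2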


@@ -4,25 +4,71 @@ fs = require "fs"
async = require "async"
mkdirp = require "mkdirp"
OutputFileFinder = require "./OutputFileFinder"
ResourceStateManager = require "./ResourceStateManager"
Metrics = require "./Metrics"
logger = require "logger-sharelatex"
settings = require("settings-sharelatex")
parallelFileDownloads = settings.parallelFileDownloads or 1
module.exports = ResourceWriter =
syncResourcesToDisk: (project_id, resources, basePath, callback = (error) ->) ->
@_removeExtraneousFiles resources, basePath, (error) =>
syncResourcesToDisk: (request, basePath, callback = (error, resourceList) ->) ->
if request.syncType is "incremental"
logger.log project_id: request.project_id, user_id: request.user_id, "incremental sync"
ResourceStateManager.checkProjectStateMatches request.syncState, basePath, (error, resourceList) ->
return callback(error) if error?
ResourceWriter._removeExtraneousFiles resourceList, basePath, (error, outputFiles, allFiles) ->
return callback(error) if error?
ResourceStateManager.checkResourceFiles resourceList, allFiles, basePath, (error) ->
return callback(error) if error?
ResourceWriter.saveIncrementalResourcesToDisk request.project_id, request.resources, basePath, (error) ->
return callback(error) if error?
callback(null, resourceList)
else
logger.log project_id: request.project_id, user_id: request.user_id, "full sync"
@saveAllResourcesToDisk request.project_id, request.resources, basePath, (error) ->
return callback(error) if error?
ResourceStateManager.saveProjectState request.syncState, request.resources, basePath, (error) ->
return callback(error) if error?
callback(null, request.resources)
saveIncrementalResourcesToDisk: (project_id, resources, basePath, callback = (error) ->) ->
@_createDirectory basePath, (error) =>
return callback(error) if error?
jobs = for resource in resources
do (resource) =>
(callback) => @_writeResourceToDisk(project_id, resource, basePath, callback)
async.series jobs, callback
async.parallelLimit jobs, parallelFileDownloads, callback
_removeExtraneousFiles: (resources, basePath, _callback = (error) ->) ->
saveAllResourcesToDisk: (project_id, resources, basePath, callback = (error) ->) ->
@_createDirectory basePath, (error) =>
return callback(error) if error?
@_removeExtraneousFiles resources, basePath, (error) =>
return callback(error) if error?
jobs = for resource in resources
do (resource) =>
(callback) => @_writeResourceToDisk(project_id, resource, basePath, callback)
async.parallelLimit jobs, parallelFileDownloads, callback
_createDirectory: (basePath, callback = (error) ->) ->
fs.mkdir basePath, (err) ->
if err?
if err.code is 'EEXIST'
return callback()
else
logger.log {err: err, dir:basePath}, "error creating directory"
return callback(err)
else
return callback()
_removeExtraneousFiles: (resources, basePath, _callback = (error, outputFiles, allFiles) ->) ->
timer = new Metrics.Timer("unlink-output-files")
callback = (error) ->
callback = (error, result...) ->
timer.done()
_callback(error)
_callback(error, result...)
OutputFileFinder.findOutputFiles resources, basePath, (error, outputFiles) ->
OutputFileFinder.findOutputFiles resources, basePath, (error, outputFiles, allFiles) ->
return callback(error) if error?
jobs = []
@@ -30,37 +76,67 @@ module.exports = ResourceWriter =
do (file) ->
path = file.path
should_delete = true
if path.match(/^output\./) or path.match(/\.aux$/)
if path.match(/^output\./) or path.match(/\.aux$/) or path.match(/^cache\//) # knitr cache
should_delete = false
if path == "output.pdf" or path == "output.dvi" or path == "output.log"
if path.match(/^output-.*/) # Tikz cached figures (default case)
should_delete = false
if path.match(/\.(pdf|dpth|md5)$/) # Tikz cached figures (by extension)
should_delete = false
if path.match(/\.(pygtex|pygstyle)$/) or path.match(/(^|\/)_minted-[^\/]+\//) # minted files/directory
should_delete = false
if path.match(/\.md\.tex$/) or path.match(/(^|\/)_markdown_[^\/]+\//) # markdown files/directory
should_delete = false
if path.match(/-eps-converted-to\.pdf$/) # Epstopdf generated files
should_delete = false
if path == "output.pdf" or path == "output.dvi" or path == "output.log" or path == "output.xdv"
should_delete = true
if path == "output.tex" # created by TikzManager if present in output files
should_delete = true
if should_delete
jobs.push (callback) -> ResourceWriter._deleteFileIfNotDirectory Path.join(basePath, path), callback
async.series jobs, callback
async.series jobs, (error) ->
return callback(error) if error?
callback(null, outputFiles, allFiles)
_deleteFileIfNotDirectory: (path, callback = (error) ->) ->
fs.stat path, (error, stat) ->
return callback(error) if error?
if stat.isFile()
fs.unlink path, callback
if error? and error.code is 'ENOENT'
return callback()
else if error?
logger.err {err: error, path: path}, "error stating file in deleteFileIfNotDirectory"
return callback(error)
else if stat.isFile()
fs.unlink path, (error) ->
if error?
logger.err {err: error, path: path}, "error removing file in deleteFileIfNotDirectory"
callback(error)
else
callback()
else
callback()
_writeResourceToDisk: (project_id, resource, basePath, callback = (error) ->) ->
path = Path.normalize(Path.join(basePath, resource.path))
if (path.slice(0, basePath.length) != basePath)
return callback new Error("resource path is outside root directory")
mkdirp Path.dirname(path), (error) ->
ResourceWriter.checkPath basePath, resource.path, (error, path) ->
return callback(error) if error?
# TODO: Don't overwrite file if it hasn't been modified
if resource.url?
UrlCache.downloadUrlToFile project_id, resource.url, path, resource.modified, (err)->
if err?
logger.err err:err, project_id:project_id, path:path, resource_url:resource.url, modified:resource.modified, "error downloading file for resources"
callback() #try and continue compiling even if http resource can not be downloaded at this time
else
fs.writeFile path, resource.content, callback
mkdirp Path.dirname(path), (error) ->
return callback(error) if error?
# TODO: Don't overwrite file if it hasn't been modified
if resource.url?
UrlCache.downloadUrlToFile project_id, resource.url, path, resource.modified, (err)->
if err?
logger.err err:err, project_id:project_id, path:path, resource_url:resource.url, modified:resource.modified, "error downloading file for resources"
callback() #try and continue compiling even if http resource can not be downloaded at this time
else
process = require("process")
fs.writeFile path, resource.content, callback
try
result = fs.lstatSync(path)
catch e
checkPath: (basePath, resourcePath, callback) ->
path = Path.normalize(Path.join(basePath, resourcePath))
if (path.slice(0, basePath.length + 1) != basePath + "/")
return callback new Error("resource path is outside root directory")
else
return callback(null, path)


@@ -0,0 +1,25 @@
fs = require "fs"
logger = require "logger-sharelatex"
module.exports = SafeReader =
# safely read up to size bytes from a file and return result as a
# string
readFile: (file, size, encoding, callback = (error, result) ->) ->
fs.open file, 'r', (err, fd) ->
return callback() if err? and err.code is 'ENOENT'
return callback(err) if err?
# safely return always closing the file
callbackWithClose = (err, result...) ->
fs.close fd, (err1) ->
return callback(err) if err?
return callback(err1) if err1?
callback(null, result...)
buff = Buffer.alloc(size) # zero-filled, so a short read cannot leak uninitialised memory
fs.read fd, buff, 0, buff.length, 0, (err, bytesRead, buffer) ->
return callbackWithClose(err) if err?
result = buffer.toString(encoding, 0, bytesRead)
callbackWithClose(null, result, bytesRead)
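
Typical use, as in ResourceStateManager above: cap the read at a known maximum and let a missing file come back as an empty result (path illustrative):

SafeReader = require "./SafeReader"  # the module above
SafeReader.readFile "/compile/project/.project-sync-state", 128 * 1024, "utf8", (err, result, bytesRead) ->
  return console.error err if err?
  console.log "read #{bytesRead ? 0} bytes:", result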


@@ -29,10 +29,10 @@ module.exports = ForbidSymlinks = (staticFn, root, options) ->
# check that the requested path is not a symlink
fs.realpath requestedFsPath, (err, realFsPath)->
if err?
logger.warn err:err, requestedFsPath:requestedFsPath, realFsPath:realFsPath, path: req.params[0], project_id: req.params.project_id, "error checking file access"
if err.code == 'ENOENT'
return res.sendStatus(404)
else
logger.error err:err, requestedFsPath:requestedFsPath, realFsPath:realFsPath, path: req.params[0], project_id: req.params.project_id, "error checking file access"
return res.sendStatus(500)
else if requestedFsPath != realFsPath
logger.warn requestedFsPath:requestedFsPath, realFsPath:realFsPath, path: req.params[0], project_id: req.params.project_id, "trying to access a different file (symlink), aborting"


@@ -0,0 +1,37 @@
fs = require "fs"
Path = require "path"
ResourceWriter = require "./ResourceWriter"
SafeReader = require "./SafeReader"
logger = require "logger-sharelatex"
# for \tikzexternalize or pstool to work the main file needs to match the
# jobname. Since we set the -jobname to output, we have to create a
# copy of the main file as 'output.tex'.
module.exports = TikzManager =
checkMainFile: (compileDir, mainFile, resources, callback = (error, needsMainFile) ->) ->
# if there's already an output.tex file, we don't want to touch it
for resource in resources
if resource.path is "output.tex"
logger.log compileDir: compileDir, mainFile: mainFile, "output.tex already in resources"
return callback(null, false)
# if there's no output.tex, see if we are using tikz/pgf or pstool in the main file
ResourceWriter.checkPath compileDir, mainFile, (error, path) ->
return callback(error) if error?
SafeReader.readFile path, 65536, "utf8", (error, content) ->
return callback(error) if error?
usesTikzExternalize = content?.indexOf("\\tikzexternalize") >= 0
usesPsTool = content?.indexOf("{pstool}") >= 0
logger.log compileDir: compileDir, mainFile: mainFile, usesTikzExternalize:usesTikzExternalize, usesPsTool: usesPsTool, "checked for packages needing main file as output.tex"
needsMainFile = (usesTikzExternalize || usesPsTool)
callback null, needsMainFile
injectOutputFile: (compileDir, mainFile, callback = (error) ->) ->
ResourceWriter.checkPath compileDir, mainFile, (error, path) ->
return callback(error) if error?
fs.readFile path, "utf8", (error, content) ->
return callback(error) if error?
logger.log compileDir: compileDir, mainFile: mainFile, "copied file to output.tex as project uses packages which require it"
# use wx flag to ensure that output file does not already exist
fs.writeFile Path.join(compileDir, "output.tex"), content, {flag:'wx'}, callback
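
A sketch of the intended call order during a compile (the caller is not part of this diff; directory and resources are illustrative):

TikzManager = require "./TikzManager"  # the module above
resources = [{path: "main.tex"}]
TikzManager.checkMainFile "/compile/project", "main.tex", resources, (error, needsMainFile) ->
  return console.error error if error?
  if needsMainFile
    TikzManager.injectOutputFile "/compile/project", "main.tex", (error) ->
      console.log "main file copied to output.tex" unless error?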


@@ -1,4 +1,5 @@
db = require("./db")
dbQueue = require "./DbQueue"
UrlFetcher = require("./UrlFetcher")
Settings = require("settings-sharelatex")
crypto = require("crypto")
@@ -51,7 +52,6 @@ module.exports = UrlCache =
_doesUrlNeedDownloading: (project_id, url, lastModified, callback = (error, needsDownloading) ->) ->
if !lastModified?
return callback null, true
UrlCache._findUrlDetails project_id, url, (error, urlDetails) ->
return callback(error) if error?
if !urlDetails? or !urlDetails.lastModified? or urlDetails.lastModified.getTime() < lastModified.getTime()
@@ -87,35 +87,48 @@ module.exports = UrlCache =
callback null
_deleteUrlCacheFromDisk: (project_id, url, callback = (error) ->) ->
fs.unlink UrlCache._cacheFilePathForUrl(project_id, url), callback
fs.unlink UrlCache._cacheFilePathForUrl(project_id, url), (error) ->
if error? and error.code != 'ENOENT' # no error if the file isn't present
return callback(error)
else
return callback()
_findUrlDetails: (project_id, url, callback = (error, urlDetails) ->) ->
db.UrlCache.find(where: { url: url, project_id: project_id })
.then((urlDetails) -> callback null, urlDetails)
.error callback
job = (cb)->
db.UrlCache.find(where: { url: url, project_id: project_id })
.then((urlDetails) -> cb null, urlDetails)
.error cb
dbQueue.queue.push job, callback
_updateOrCreateUrlDetails: (project_id, url, lastModified, callback = (error) ->) ->
db.UrlCache.findOrCreate(where: {url: url, project_id: project_id})
.spread(
(urlDetails, created) ->
urlDetails.updateAttributes(lastModified: lastModified)
.then(() -> callback())
.error(callback)
)
.error callback
job = (cb)->
db.UrlCache.findOrCreate(where: {url: url, project_id: project_id})
.spread(
(urlDetails, created) ->
urlDetails.updateAttributes(lastModified: lastModified)
.then(() -> cb())
.error(cb)
)
.error cb
dbQueue.queue.push(job, callback)
_clearUrlDetails: (project_id, url, callback = (error) ->) ->
db.UrlCache.destroy(where: {url: url, project_id: project_id})
.then(() -> callback null)
.error callback
job = (cb)->
db.UrlCache.destroy(where: {url: url, project_id: project_id})
.then(() -> cb null)
.error cb
dbQueue.queue.push(job, callback)
_findAllUrlsInProject: (project_id, callback = (error, urls) ->) ->
db.UrlCache.findAll(where: { project_id: project_id })
.then(
(urlEntries) ->
callback null, urlEntries.map((entry) -> entry.url)
)
.error callback
job = (cb)->
db.UrlCache.findAll(where: { project_id: project_id })
.then(
(urlEntries) ->
cb null, urlEntries.map((entry) -> entry.url)
)
.error cb
dbQueue.queue.push(job, callback)


@@ -1,6 +1,8 @@
request = require("request").defaults(jar: false)
fs = require("fs")
logger = require "logger-sharelatex"
settings = require("settings-sharelatex")
URL = require('url');
oneMinute = 60 * 1000
@@ -11,6 +13,9 @@ module.exports = UrlFetcher =
_callback(error)
_callback = () ->
if settings.filestoreDomainOveride?
p = URL.parse(url).path
url = "#{settings.filestoreDomainOveride}#{p}"
timeoutHandler = setTimeout () ->
timeoutHandler = null
logger.error url:url, filePath: filePath, "Timed out downloading file to cache"
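
The filestoreDomainOveride hook above swaps only the scheme and host, keeping the path; for example (addresses illustrative):

URL = require "url"
url = "http://filestore.example.com/project/123/file/456"
override = "http://10.0.0.20:3009"  # stands in for settings.filestoreDomainOveride
console.log "#{override}#{URL.parse(url).path}"
# http://10.0.0.20:3009/project/123/file/456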


@@ -1,9 +1,12 @@
Sequelize = require("sequelize")
Settings = require("settings-sharelatex")
_ = require("underscore")
logger = require "logger-sharelatex"
options = _.extend {logging:false}, Settings.mysql.clsi
logger.log dbPath:Settings.mysql.clsi.storage, "connecting to db"
sequelize = new Sequelize(
Settings.mysql.clsi.database,
Settings.mysql.clsi.username,
@@ -11,6 +14,12 @@ sequelize = new Sequelize(
options
)
if Settings.mysql.clsi.dialect == "sqlite"
logger.log "running PRAGMA journal_mode=WAL;"
sequelize.query("PRAGMA journal_mode=WAL;")
sequelize.query("PRAGMA synchronous=OFF;")
sequelize.query("PRAGMA read_uncommitted = true;")
module.exports =
UrlCache: sequelize.define("UrlCache", {
url: Sequelize.STRING
@@ -32,5 +41,15 @@ module.exports =
]
})
sync: () -> sequelize.sync()
op: Sequelize.Op
sync: () ->
logger.log dbPath:Settings.mysql.clsi.storage, "syncing db schema"
sequelize.sync()
.then(->
logger.log "db sync complete"
).catch((err)->
console.log err, "error syncing"
)

4
bin/acceptance_test Normal file

@@ -0,0 +1,4 @@
#!/bin/bash
set -e;
MOCHA="node_modules/.bin/mocha --recursive --reporter spec --timeout 15000"
$MOCHA "$@"

BIN
bin/synctex Executable file

Binary file not shown.

9
buildscript.txt Normal file

@@ -0,0 +1,9 @@
clsi
--language=coffeescript
--node-version=10.15.0
--acceptance-creds=None
--dependencies=mongo,redis
--docker-repos=gcr.io/overleaf-ops
--env-pass-through=TEXLIVE_IMAGE
--build-target=docker
--script-version=1.1.22

39
cloudbuild.yaml Normal file

@@ -0,0 +1,39 @@
steps:
- id: texlive
name: 'gcr.io/overleaf-ops/texlive-full:2017.1'
- id: build
name: 'gcr.io/overleaf-ops/cloud-builder'
args:
- 'build'
env:
- 'BUILD_NUMBER=$SHORT_SHA'
- 'BRANCH_NAME=$BRANCH_NAME'
waitFor: ['-']
- id: test_unit
name: 'gcr.io/overleaf-ops/cloud-builder'
args:
- 'test_unit'
env:
- 'DOCKER_COMPOSE_FLAGS=-f docker-compose.ci.yml'
- 'BUILD_NUMBER=$SHORT_SHA'
- 'BRANCH_NAME=$BRANCH_NAME'
waitFor:
- build
- id: test_acceptance
name: 'gcr.io/overleaf-ops/cloud-builder'
args:
- 'test_acceptance'
env:
- 'DOCKER_COMPOSE_FLAGS=-f docker-compose.ci.yml'
- 'BUILD_NUMBER=$SHORT_SHA'
- 'BRANCH_NAME=$BRANCH_NAME'
- 'TEXLIVE_IMAGE=gcr.io/overleaf-ops/texlive-full:2017.1'
waitFor:
- build
- texlive
images:
- 'gcr.io/$PROJECT_ID/clsi:${BRANCH_NAME}-${SHORT_SHA}'
timeout: 1800s
options:
diskSizeGb: 200
machineType: 'N1_HIGHCPU_8'


@@ -7,33 +7,65 @@ module.exports =
clsi:
database: "clsi"
username: "clsi"
password: null
dialect: "sqlite"
storage: Path.resolve(__dirname + "/../db.sqlite")
storage: process.env["SQLITE_PATH"] or Path.resolve(__dirname + "/../db.sqlite")
pool:
max: 1
min: 1
retry:
max: 10
compileSizeLimit: process.env["COMPILE_SIZE_LIMIT"] or "7mb"
path:
compilesDir: Path.resolve(__dirname + "/../compiles")
clsiCacheDir: Path.resolve(__dirname + "/../cache")
synctexBaseDir: (project_id) -> Path.join(@compilesDir, project_id)
# clsi:
# commandRunner: "docker-runner-sharelatex"
# docker:
# image: "quay.io/sharelatex/texlive-full"
# env:
# PATH: "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/2013/bin/x86_64-linux/"
# HOME: "/tmp"
# modem:
# socketPath: false
# user: "tex"
internal:
clsi:
port: 3013
host: "localhost"
host: process.env["LISTEN_ADDRESS"] or "localhost"
load_balancer_agent:
report_load:true
load_port: 3048
local_port: 3049
apis:
clsi:
url: "http://localhost:3013"
url: "http://#{process.env['CLSI_HOST'] or 'localhost'}:3013"
smokeTest: false
smokeTest: process.env["SMOKE_TEST"] or false
project_cache_length_ms: 1000 * 60 * 60 * 24
parallelFileDownloads: process.env["FILESTORE_PARALLEL_FILE_DOWNLOADS"] or 1
parallelSqlQueryLimit: process.env["FILESTORE_PARALLEL_SQL_QUERY_LIMIT"] or 1
filestoreDomainOveride: process.env["FILESTORE_DOMAIN_OVERRIDE"]
texliveImageNameOveride: process.env["TEX_LIVE_IMAGE_NAME_OVERRIDE"]
sentry:
dsn: process.env['SENTRY_DSN']
if process.env["DOCKER_RUNNER"]
module.exports.clsi =
dockerRunner: process.env["DOCKER_RUNNER"] == "true"
docker:
image: process.env["TEXLIVE_IMAGE"] or "quay.io/sharelatex/texlive-full:2017.1"
env:
HOME: "/tmp"
socketPath: "/var/run/docker.sock"
user: process.env["TEXLIVE_IMAGE_USER"] or "tex"
expireProjectAfterIdleMs: 24 * 60 * 60 * 1000
checkProjectsIntervalMs: 10 * 60 * 1000
try
seccomp_profile_path = Path.resolve(__dirname + "/../seccomp/clsi-profile.json")
module.exports.clsi.docker.seccomp_profile = JSON.stringify(JSON.parse(require("fs").readFileSync(seccomp_profile_path)))
catch error
console.log error, "could not load seccom profile from #{seccomp_profile_path}"
module.exports.path.synctexBaseDir = -> "/compile"
module.exports.path.sandboxedCompilesHostDir = process.env["COMPILES_HOST_DIR"]
module.exports.path.synctexBinHostPath = process.env["SYNCTEX_BIN_HOST_PATH"]

5
debug Executable file

@@ -0,0 +1,5 @@
#!/bin/bash
echo "hello world"
sleep 3
echo "awake"
/opt/synctex pdf /compile/output.pdf 1 100 200

32
docker-compose-config.yml Normal file

@@ -0,0 +1,32 @@
version: "2"
services:
dev:
environment:
TEXLIVE_IMAGE: quay.io/sharelatex/texlive-full:2017.1
TEXLIVE_IMAGE_USER: "tex"
SHARELATEX_CONFIG: /app/config/settings.defaults.coffee
DOCKER_RUNNER: "true"
COMPILES_HOST_DIR: $PWD/compiles
SYNCTEX_BIN_HOST_PATH: $PWD/bin/synctex
volumes:
- /var/run/docker.sock:/var/run/docker.sock
- ./compiles:/app/compiles
- ./cache:/app/cache
- ./bin/synctex:/app/bin/synctex
ci:
environment:
TEXLIVE_IMAGE: quay.io/sharelatex/texlive-full:2017.1
TEXLIVE_IMAGE_USER: "tex"
SHARELATEX_CONFIG: /app/config/settings.defaults.coffee
DOCKER_RUNNER: "true"
COMPILES_HOST_DIR: $PWD/compiles
SYNCTEX_BIN_HOST_PATH: $PWD/bin/synctex
SQLITE_PATH: /app/compiles/db.sqlite
volumes:
- /var/run/docker.sock:/var/run/docker.sock:rw
- ./compiles:/app/compiles
- ./cache:/app/cache
- ./bin/synctex:/app/bin/synctex

49
docker-compose.ci.yml Normal file

@@ -0,0 +1,49 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/sharelatex/sharelatex-dev-environment
# Version: 1.1.22
version: "2"
services:
test_unit:
image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
command: npm run test:unit:_run
environment:
NODE_ENV: test
test_acceptance:
build: .
image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
extends:
file: docker-compose-config.yml
service: ci
environment:
ELASTIC_SEARCH_DSN: es:9200
REDIS_HOST: redis
MONGO_HOST: mongo
POSTGRES_HOST: postgres
MOCHA_GREP: ${MOCHA_GREP}
NODE_ENV: test
TEXLIVE_IMAGE:
depends_on:
- mongo
- redis
command: npm run test:acceptance:_run
tar:
build: .
image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
volumes:
- ./:/tmp/build/
command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
user: root
redis:
image: redis
mongo:
image: mongo:3.4

56
docker-compose.yml Normal file

@@ -0,0 +1,56 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/sharelatex/sharelatex-dev-environment
# Version: 1.1.22
version: "2"
services:
test_unit:
build: .
volumes:
- .:/app
working_dir: /app
environment:
MOCHA_GREP: ${MOCHA_GREP}
NODE_ENV: test
command: npm run test:unit
test_acceptance:
build: .
volumes:
- .:/app
working_dir: /app
extends:
file: docker-compose-config.yml
service: dev
environment:
ELASTIC_SEARCH_DSN: es:9200
REDIS_HOST: redis
MONGO_HOST: mongo
POSTGRES_HOST: postgres
MOCHA_GREP: ${MOCHA_GREP}
LOG_LEVEL: ERROR
NODE_ENV: test
depends_on:
- mongo
- redis
command: npm run test:acceptance
tar:
build: .
image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
volumes:
- ./:/tmp/build/
command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
user: root
redis:
image: redis
mongo:
image: mongo:3.4

6
entrypoint.sh Normal file

@@ -0,0 +1,6 @@
#!/bin/bash
set -o pipefail
/app/inner-entrypoint.sh "$@" 2>&1 | ts

27
inner-entrypoint.sh Executable file

@@ -0,0 +1,27 @@
#!/bin/sh
set -x
date
echo "Changing permissions of /var/run/docker.sock for sibling containers"
ls -al /var/run/docker.sock
docker --version
cat /etc/passwd
DOCKER_GROUP=$(stat -c '%g' /var/run/docker.sock)
groupadd --non-unique --gid ${DOCKER_GROUP} dockeronhost
usermod -aG dockeronhost node
mkdir -p /app/cache
chown -R node:node /app/cache
mkdir -p /app/compiles
chown -R node:node /app/compiles
chown -R node:node /app/bin/synctex
mkdir -p /app/test/acceptance/fixtures/tmp/
chown -R node:node /app
chown -R node:node /app/bin
exec runuser -u node -- "$@"

4
install_deps.sh Executable file

@@ -0,0 +1,4 @@
#!/bin/sh
wget -qO- https://get.docker.com/ | sh
apt-get install poppler-utils vim ghostscript --yes
npm rebuild

41
kube.yaml Normal file

@@ -0,0 +1,41 @@
apiVersion: v1
kind: Service
metadata:
name: clsi
namespace: default
spec:
type: LoadBalancer
ports:
- port: 80
protocol: TCP
targetPort: 80
selector:
run: clsi
---
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
name: clsi
namespace: default
spec:
replicas: 2
template:
metadata:
labels:
run: clsi
spec:
containers:
- name: clsi
image: gcr.io/henry-terraform-admin/clsi
imagePullPolicy: Always
readinessProbe:
httpGet:
path: status
port: 80
periodSeconds: 5
initialDelaySeconds: 0
failureThreshold: 3
successThreshold: 1

19
nodemon.json Normal file

@@ -0,0 +1,19 @@
{
"ignore": [
".git",
"node_modules/"
],
"verbose": true,
"legacyWatch": true,
"execMap": {
"js": "npm run start"
},
"watch": [
"app/coffee/",
"app.coffee",
"config/"
],
"ext": "coffee"
}

2948
npm-shrinkwrap.json generated Normal file

File diff suppressed because it is too large


@@ -6,41 +6,49 @@
"type": "git",
"url": "https://github.com/sharelatex/clsi-sharelatex.git"
},
"scripts": {
"compile:app": "([ -e app/coffee ] && coffee -m $COFFEE_OPTIONS -o app/js -c app/coffee || echo 'No CoffeeScript folder to compile') && ( [ -e app.coffee ] && coffee -m $COFFEE_OPTIONS -c app.coffee || echo 'No CoffeeScript app to compile')",
"start": "npm run compile:app && node $NODE_APP_OPTIONS app.js",
"test:acceptance:_run": "mocha --recursive --reporter spec --timeout 30000 --exit $@ test/acceptance/js",
"test:acceptance": "npm run compile:app && npm run compile:acceptance_tests && npm run test:acceptance:_run -- --grep=$MOCHA_GREP",
"test:unit:_run": "mocha --recursive --reporter spec --exit $@ test/unit/js",
"test:unit": "npm run compile:app && npm run compile:unit_tests && npm run test:unit:_run -- --grep=$MOCHA_GREP",
"compile:unit_tests": "[ ! -e test/unit/coffee ] && echo 'No unit tests to compile' || coffee -o test/unit/js -c test/unit/coffee",
"compile:acceptance_tests": "[ ! -e test/acceptance/coffee ] && echo 'No acceptance tests to compile' || coffee -o test/acceptance/js -c test/acceptance/coffee",
"compile:all": "npm run compile:app && npm run compile:unit_tests && npm run compile:acceptance_tests && npm run compile:smoke_tests",
"nodemon": "nodemon --config nodemon.json",
"compile:smoke_tests": "[ ! -e test/smoke/coffee ] && echo 'No smoke tests to compile' || coffee -o test/smoke/js -c test/smoke/coffee"
},
"author": "James Allen <james@sharelatex.com>",
"dependencies": {
"async": "0.2.9",
"body-parser": "^1.2.0",
"dockerode": "^2.5.3",
"express": "^4.2.0",
"fs-extra": "^0.16.3",
"heapdump": "^0.3.5",
"lockfile": "^1.0.3",
"logger-sharelatex": "^1.7.0",
"lynx": "0.0.11",
"metrics-sharelatex": "^2.2.0",
"mkdirp": "0.3.5",
"mysql": "2.6.2",
"request": "~2.21.0",
"logger-sharelatex": "git+https://github.com/sharelatex/logger-sharelatex.git#v1.0.0",
"settings-sharelatex": "git+https://github.com/sharelatex/settings-sharelatex.git#v1.0.0",
"metrics-sharelatex": "git+https://github.com/sharelatex/metrics-sharelatex.git#v1.3.0",
"sequelize": "^2.1.3",
"wrench": "~1.5.4",
"request": "^2.21.0",
"sequelize": "^4.38.0",
"settings-sharelatex": "git+https://github.com/sharelatex/settings-sharelatex.git#v1.1.0",
"smoke-test-sharelatex": "git+https://github.com/sharelatex/smoke-test-sharelatex.git#v0.2.0",
"sqlite3": "~2.2.0",
"express": "^4.2.0",
"body-parser": "^1.2.0",
"fs-extra": "^0.16.3",
"sqlite3": "^4.0.6",
"underscore": "^1.8.2",
"v8-profiler": "^5.2.4",
"heapdump": "^0.3.5"
"v8-profiler-node8": "^6.0.1",
"wrench": "~1.5.4"
},
"devDependencies": {
"mocha": "1.10.0",
"coffee-script": "1.6.0",
"chai": "~1.8.1",
"sinon": "~1.7.3",
"grunt": "~0.4.2",
"grunt-contrib-coffee": "~0.7.0",
"grunt-contrib-clean": "~0.5.0",
"grunt-shell": "~0.6.1",
"grunt-mocha-test": "~0.8.1",
"sandboxed-module": "~0.3.0",
"timekeeper": "0.0.4",
"grunt-execute": "^0.1.5",
"bunyan": "^0.22.1",
"grunt-bunyan": "^0.5.0"
"chai": "~1.8.1",
"coffeescript": "1.6.0",
"mocha": "^4.0.1",
"sandboxed-module": "~0.3.0",
"sinon": "~1.7.3",
"timekeeper": "0.0.4"
}
}

3
patch-texlive-dockerfile Normal file

@@ -0,0 +1,3 @@
FROM quay.io/sharelatex/texlive-full:2017.1
# RUN usermod -u 1001 tex

836
seccomp/clsi-profile.json Normal file

@@ -0,0 +1,836 @@
{
"defaultAction": "SCMP_ACT_ERRNO",
"architectures": [
"SCMP_ARCH_X86_64",
"SCMP_ARCH_X86",
"SCMP_ARCH_X32"
],
"syscalls": [
{
"name": "access",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "arch_prctl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "brk",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "chdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "chmod",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clock_getres",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clock_gettime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clock_nanosleep",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clone",
"action": "SCMP_ACT_ALLOW",
"args": [
{
"index": 0,
"value": 2080505856,
"valueTwo": 0,
"op": "SCMP_CMP_MASKED_EQ"
}
]
},
{
"name": "close",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "copy_file_range",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "creat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "dup",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "dup2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "dup3",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "execve",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "execveat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "exit",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "exit_group",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "faccessat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fadvise64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fadvise64_64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fallocate",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchmod",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchmodat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fcntl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fcntl64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fdatasync",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fork",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstatat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstatfs",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstatfs64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fsync",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "ftruncate",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "ftruncate64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "futex",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "futimesat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getcpu",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getcwd",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getdents",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getdents64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getegid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getegid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "geteuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "geteuid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgroups",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgroups32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpgrp",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getppid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpriority",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresgid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresuid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getrlimit",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "get_robust_list",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getrusage",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getsid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "gettid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getuid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "ioctl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "kill",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "_llseek",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "lseek",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "lstat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "lstat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "madvise",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mkdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mkdirat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mmap",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mmap2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mprotect",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mremap",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "munmap",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "newfstatat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "open",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "openat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pause",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pipe",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pipe2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "prctl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pread64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "preadv",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "prlimit64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pwrite64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pwritev",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "read",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "readlink",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "readlinkat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "readv",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rename",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "renameat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "renameat2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "restart_syscall",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rmdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigaction",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigpending",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigprocmask",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigqueueinfo",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigreturn",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigsuspend",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigtimedwait",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_tgsigqueueinfo",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_getaffinity",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_getparam",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_get_priority_max",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_get_priority_min",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_getscheduler",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_rr_get_interval",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_yield",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sendfile",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sendfile64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setgroups",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setgroups32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "set_robust_list",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "set_tid_address",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sigaltstack",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "stat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "stat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "statfs",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "statfs64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sync",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sync_file_range",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "syncfs",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sysinfo",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "tgkill",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_create",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_delete",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_getoverrun",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_gettime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_settime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "times",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "tkill",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "truncate",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "truncate64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "umask",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "uname",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "unlink",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "unlinkat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "utime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "utimensat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "utimes",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "vfork",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "vhangup",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "wait4",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "waitid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "write",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "writev",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pread",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "capget",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "capset",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchown",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "gettimeofday",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "epoll_pwait",
"action": "SCMP_ACT_ALLOW",
"args": []
}
]
}
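
A profile like the one above is normally handed to Docker when the compile container is created. A minimal sketch with dockerode (the image name and profile path are illustrative, not the CLSI's actual wiring; Docker's API takes the profile JSON inline via HostConfig.SecurityOpt):

# Hedged sketch: pass the seccomp profile above to a new container.
# Docker's API wants the profile JSON itself, not a file path.
fs = require "fs"
Docker = require "dockerode"

docker = new Docker()
profile = fs.readFileSync "seccomp/clsi-profile.json", "utf8"

docker.createContainer {
  Image: "texlive-full:2017.1-opt"   # illustrative image name
  Cmd: ["latexmk", "main.tex"]
  HostConfig:
    SecurityOpt: ["seccomp=#{profile}"]
}, (error, container) ->
  throw error if error?
  container.start (error) ->
    throw error if error?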

34
synctex.profile Normal file

@@ -0,0 +1,34 @@
include /etc/firejail/disable-common.inc
include /etc/firejail/disable-devel.inc
# include /etc/firejail/disable-mgmt.inc ## removed in 0.9.40
# include /etc/firejail/disable-secret.inc ## removed in 0.9.40
read-only /bin
blacklist /boot
blacklist /dev
read-only /etc
blacklist /home # blacklisted for synctex
read-only /lib
read-only /lib64
blacklist /media
blacklist /mnt
blacklist /opt
blacklist /root
read-only /run
blacklist /sbin
blacklist /selinux
blacklist /src
blacklist /sys
read-only /usr
caps.drop all
noroot
nogroups
net none
private-tmp
private-dev
shell none
seccomp
nonewprivs
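
Firejail applies a profile like this at launch time via its standard --profile flag. A minimal sketch of wrapping a synctex call that way (binary and profile paths are illustrative, not the CLSI's actual invocation):

# Hedged sketch: run synctex under the firejail sandbox defined above.
ChildProcess = require "child_process"

runSynctexSandboxed = (args, callback = (error, stdout) ->) ->
  command = ["--profile=synctex.profile", "synctex"].concat(args)
  ChildProcess.execFile "firejail", command, timeout: 10000, (error, stdout, stderr) ->
    callback error, stdout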


@@ -1,9 +1,10 @@
Client = require "./helpers/Client"
request = require "request"
require("chai").should()
ClsiApp = require "./helpers/ClsiApp"
describe "Broken LaTeX file", ->
before ->
before (done)->
@broken_request =
resources: [
path: "main.tex"
@@ -24,6 +25,7 @@ describe "Broken LaTeX file", ->
\\end{document}
'''
]
ClsiApp.ensureRunning done
describe "on first run", ->
before (done) ->


@@ -1,9 +1,10 @@
Client = require "./helpers/Client"
request = require "request"
require("chai").should()
ClsiApp = require "./helpers/ClsiApp"
describe "Deleting Old Files", ->
before ->
before (done)->
@request =
resources: [
path: "main.tex"
@@ -14,6 +15,7 @@ describe "Deleting Old Files", ->
\\end{document}
'''
]
ClsiApp.ensureRunning done
describe "on first run", ->
before (done) ->


@@ -3,27 +3,56 @@ request = require "request"
require("chai").should()
fs = require "fs"
ChildProcess = require "child_process"
fixturePath = (path) -> __dirname + "/../fixtures/" + path
ClsiApp = require "./helpers/ClsiApp"
logger = require("logger-sharelatex")
Path = require("path")
fixturePath = (path) -> Path.normalize(__dirname + "/../fixtures/" + path)
process = require "process"
console.log process.pid, process.ppid, process.getuid(), process.getgroups(), "PID"
try
console.log "creating tmp directory", fixturePath("tmp")
fs.mkdirSync(fixturePath("tmp"))
catch e
catch err
console.log err, fixturePath("tmp"), "unable to create fixture tmp path"
MOCHA_LATEX_TIMEOUT = 60 * 1000
convertToPng = (pdfPath, pngPath, callback = (error) ->) ->
convert = ChildProcess.exec "convert #{fixturePath(pdfPath)} #{fixturePath(pngPath)}"
command = "convert #{fixturePath(pdfPath)} #{fixturePath(pngPath)}"
console.log "COMMAND"
console.log command
convert = ChildProcess.exec command
stdout = ""
convert.stdout.on "data", (chunk) -> console.log "STDOUT", chunk.toString()
convert.stderr.on "data", (chunk) -> console.log "STDERR", chunk.toString()
convert.on "exit", () ->
callback()
compare = (originalPath, generatedPath, callback = (error, same) ->) ->
proc = ChildProcess.exec "compare -metric mae #{fixturePath(originalPath)} #{fixturePath(generatedPath)} #{fixturePath("tmp/diff.png")}"
diff_file = "#{fixturePath(generatedPath)}-diff.png"
proc = ChildProcess.exec "compare -metric mae #{fixturePath(originalPath)} #{fixturePath(generatedPath)} #{diff_file}"
stderr = ""
proc.stderr.on "data", (chunk) -> stderr += chunk
proc.on "exit", () ->
if stderr.trim() == "0 (0)"
# remove output diff if test matches expected image
fs.unlink diff_file, (err) ->
if err
throw err
callback null, true
else
console.log "compare result", stderr
callback null, false
checkPdfInfo = (pdfPath, callback = (error, output) ->) ->
proc = ChildProcess.exec "pdfinfo #{fixturePath(pdfPath)}"
stdout = ""
proc.stdout.on "data", (chunk) -> stdout += chunk
proc.stderr.on "data", (chunk) -> console.log "STDERR", chunk.toString()
proc.on "exit", () ->
if stdout.match(/Optimized:\s+yes/)
callback null, true
else
console.log stdout
callback null, false
compareMultiplePages = (project_id, callback = (error) ->) ->
@@ -39,44 +68,61 @@ compareMultiplePages = (project_id, callback = (error) ->) ->
compareNext page_no + 1, callback
compareNext 0, callback
comparePdf = (project_id, example_dir, callback = (error) ->) ->
console.log "CONVERT"
console.log "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png"
convertToPng "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png", (error) =>
throw error if error?
convertToPng "examples/#{example_dir}/output.pdf", "tmp/#{project_id}-source.png", (error) =>
throw error if error?
fs.stat fixturePath("tmp/#{project_id}-source-0.png"), (error, stat) =>
if error?
compare "tmp/#{project_id}-source.png", "tmp/#{project_id}-generated.png", (error, same) =>
throw error if error?
same.should.equal true
callback()
else
compareMultiplePages project_id, (error) ->
throw error if error?
callback()
downloadAndComparePdf = (project_id, example_dir, url, callback = (error) ->) ->
writeStream = fs.createWriteStream(fixturePath("tmp/#{project_id}.pdf"))
request.get(url).pipe(writeStream)
console.log("writing file out", fixturePath("tmp/#{project_id}.pdf"))
writeStream.on "close", () =>
convertToPng "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png", (error) =>
checkPdfInfo "tmp/#{project_id}.pdf", (error, optimised) =>
throw error if error?
convertToPng "examples/#{example_dir}/output.pdf", "tmp/#{project_id}-source.png", (error) =>
throw error if error?
fs.stat fixturePath("tmp/#{project_id}-source-0.png"), (error, stat) =>
if error?
compare "tmp/#{project_id}-source.png", "tmp/#{project_id}-generated.png", (error, same) =>
throw error if error?
same.should.equal true
callback()
else
compareMultiplePages project_id, (error) ->
throw error if error?
callback()
optimised.should.equal true
comparePdf project_id, example_dir, callback
Client.runServer(4242, fixturePath("examples"))
describe "Example Documents", ->
before (done) ->
ChildProcess.exec("rm test/acceptance/fixtures/tmp/*").on "exit", () -> done()
ChildProcess.exec("rm test/acceptance/fixtures/tmp/*").on "exit", () ->
ClsiApp.ensureRunning done
for example_dir in fs.readdirSync fixturePath("examples")
do (example_dir) ->
describe example_dir, ->
before ->
@project_id = Client.randomId()
@project_id = Client.randomId() + "_" + example_dir
it "should generate the correct pdf", (done) ->
this.timeout(MOCHA_LATEX_TIMEOUT)
Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) =>
if error || body?.compile?.status is "failure"
console.log "DEBUG: error", error, "body", JSON.stringify(body)
pdf = Client.getOutputFile body, "pdf"
downloadAndComparePdf(@project_id, example_dir, pdf.url, done)
it "should generate the correct pdf on the second run as well", (done) ->
this.timeout(MOCHA_LATEX_TIMEOUT)
Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) =>
if error || body?.compile?.status is "failure"
console.log "DEBUG: error", error, "body", JSON.stringify(body)
pdf = Client.getOutputFile body, "pdf"
downloadAndComparePdf(@project_id, example_dir, pdf.url, done)


@@ -1,6 +1,7 @@
Client = require "./helpers/Client"
request = require "request"
require("chai").should()
ClsiApp = require "./helpers/ClsiApp"
describe "Simple LaTeX file", ->
before (done) ->
@@ -15,7 +16,8 @@ describe "Simple LaTeX file", ->
\\end{document}
'''
]
Client.compile @project_id, @request, (@error, @res, @body) => done()
ClsiApp.ensureRunning =>
Client.compile @project_id, @request, (@error, @res, @body) => done()
it "should return the PDF", ->
pdf = Client.getOutputFile(@body, "pdf")


@@ -2,21 +2,25 @@ Client = require "./helpers/Client"
request = require "request"
require("chai").should()
expect = require("chai").expect
ClsiApp = require "./helpers/ClsiApp"
crypto = require("crypto")
describe "Syncing", ->
before (done) ->
@request =
resources: [
path: "main.tex"
content: '''
content = '''
\\documentclass{article}
\\begin{document}
Hello world
\\end{document}
'''
@request =
resources: [
path: "main.tex"
content: content
]
@project_id = Client.randomId()
Client.compile @project_id, @request, (@error, @res, @body) => done()
ClsiApp.ensureRunning =>
Client.compile @project_id, @request, (@error, @res, @body) => done()
describe "from code to pdf", ->
it "should return the correct location", (done) ->
@@ -29,7 +33,7 @@ describe "Syncing", ->
describe "from pdf to code", ->
it "should return the correct location", (done) ->
Client.syncFromPdf @project_id, 1, 100, 200, (error, codePositions) ->
Client.syncFromPdf @project_id, 1, 100, 200, (error, codePositions) =>
throw error if error?
expect(codePositions).to.deep.equal(
code: [ { file: 'main.tex', line: 3, column: -1 } ]


@@ -1,23 +1,27 @@
Client = require "./helpers/Client"
request = require "request"
require("chai").should()
ClsiApp = require "./helpers/ClsiApp"
describe "Timed out compile", ->
before (done) ->
@request =
options:
timeout: 0.01 #seconds
timeout: 10 #seconds
resources: [
path: "main.tex"
content: '''
\\documentclass{article}
\\begin{document}
Hello world
\\def\\x{Hello!\\par\\x}
\\x
\\end{document}
'''
]
@project_id = Client.randomId()
Client.compile @project_id, @request, (@error, @res, @body) => done()
ClsiApp.ensureRunning =>
Client.compile @project_id, @request, (@error, @res, @body) => done()
it "should return a timeout error", ->
@body.compile.error.should.equal "container timed out"


@@ -2,6 +2,7 @@ Client = require "./helpers/Client"
request = require "request"
require("chai").should()
sinon = require "sinon"
ClsiApp = require "./helpers/ClsiApp"
host = "localhost"
@@ -46,7 +47,8 @@ describe "Url Caching", ->
}]
sinon.spy Server, "getFile"
Client.compile @project_id, @request, (@error, @res, @body) => done()
ClsiApp.ensureRunning =>
Client.compile @project_id, @request, (@error, @res, @body) => done()
afterEach ->
Server.getFile.restore()


@@ -4,6 +4,7 @@ require("chai").should()
expect = require("chai").expect
path = require("path")
fs = require("fs")
ClsiApp = require "./helpers/ClsiApp"
describe "Syncing", ->
before (done) ->
@@ -13,7 +14,8 @@ describe "Syncing", ->
content: fs.readFileSync(path.join(__dirname,"../fixtures/naugty_strings.txt"),"utf-8")
]
@project_id = Client.randomId()
Client.compile @project_id, @request, (@error, @res, @body) => done()
ClsiApp.ensureRunning =>
Client.compile @project_id, @request, (@error, @res, @body) => done()
describe "wordcount file", ->
it "should return wordcount info", (done) ->
@@ -29,6 +31,8 @@ describe "Syncing", ->
elements: 0
mathInline: 6
mathDisplay: 0
errors: 0
messages: ""
}
)
done()


@@ -30,7 +30,11 @@ module.exports = Client =
express = require("express")
app = express()
app.use express.static(directory)
app.listen(port, host)
console.log("starting test server on", port, host)
app.listen(port, host).on "error", (error) ->
console.error "error starting server:", error.message
process.exit(1)
syncFromCode: (project_id, file, line, column, callback = (error, pdfPositions) ->) ->
request.get {


@@ -0,0 +1,24 @@
app = require('../../../../app')
require("logger-sharelatex").logger.level("info")
logger = require("logger-sharelatex")
Settings = require("settings-sharelatex")
module.exports =
running: false
initing: false
callbacks: []
ensureRunning: (callback = (error) ->) ->
if @running
return callback()
else if @initing
@callbacks.push callback
else
@initing = true
@callbacks.push callback
app.listen Settings.internal?.clsi?.port, "localhost", (error) =>
throw error if error?
@running = true
logger.log("clsi running in dev mode")
for callback in @callbacks
callback()
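
The initing flag and callbacks queue above mean that ensureRunning calls arriving while the app is still booting are parked and flushed together once listen succeeds, so every acceptance test file can share one app instance. Usage mirrors the before hooks in the diffs above:

# Hedged usage sketch, as in the acceptance tests' before hooks:
ClsiApp = require "./helpers/ClsiApp"

describe "Some acceptance suite", ->
  before (done) ->
    ClsiApp.ensureRunning done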


@@ -0,0 +1,12 @@
\documentclass{article}
\usepackage{fontawesome}
\begin{document}
Cloud \faCloud
Cog \faCog
Database \faDatabase
Leaf \faLeaf
\end{document}


@@ -0,0 +1,16 @@
\documentclass{article}
\usepackage{fontspec}
\defaultfontfeatures{Extension = .otf} % this is needed because
% fontawesome package loads by
% font name only
\usepackage{fontawesome}
\begin{document}
Cloud \faCloud
Cog \faCog
Database \faDatabase
Leaf \faLeaf
\end{document}


@@ -0,0 +1,3 @@
{
"compiler": "xelatex"
}


@@ -0,0 +1,14 @@
\documentclass{article}
\usepackage[utf8x]{inputenc}
\usepackage[hebrew,english]{babel}
\begin{document}
\selectlanguage{hebrew}
כדי לכתוב משהו באנגלית חייבים להשתמש במקרו הבא וכאן
ממשיכים לכתוב בעברית. טקסט נוסחאות תמיד יהיה בכיוון שמאל-לימין
\selectlanguage{english}
This is a test.
\end{document}

Binary file not shown.


@@ -0,0 +1,35 @@
\documentclass{article}
\usepackage[utf8]{inputenc}
\usepackage[spanish]{babel}
\begin{document}
\tableofcontents
\vspace{2cm} %Add a 2cm space
\begin{abstract}
Este es un breve resumen del contenido del
documento escrito en español.
\end{abstract}
\section{Sección Introductoria}
Esta es la primera sección, podemos agregar
algunos elementos adicionales y todo será
escrito correctamente. Más aún, si una palabra
es demasiado larga y tiene que ser truncada,
babel tratará de truncarla correctamente
dependiendo del idioma.
\section{Sección con teoremas}
Esta sección es para ver que pasa con los comandos
que definen texto
%% chunk options: cache this chunk
%% begin.rcode my-cache, cache=TRUE
% set.seed(123)
% x = runif(10)
% sd(x) # standard deviation
%% end.rcode
\end{document}


@@ -1,4 +1,4 @@
\documentclass{article}
\documentclass[a4paper]{article}
\usepackage{graphicx}


@@ -0,0 +1,66 @@
\RequirePackage{luatex85}
\documentclass[tikz]{standalone}
\usepackage[compat=1.1.0]{tikz-feynman}
\begin{document}
\feynmandiagram [horizontal=a to b] {
i1 -- [fermion] a -- [fermion] i2,
a -- [photon] b,
f1 -- [fermion] b -- [fermion] f2,
};
\feynmandiagram [horizontal=a to b] {
i1 [particle=\(e^{-}\)] -- [fermion] a -- [fermion] i2 [particle=\(e^{+}\)],
a -- [photon, edge label=\(\gamma\), momentum'=\(k\)] b,
f1 [particle=\(\mu^{+}\)] -- [fermion] b -- [fermion] f2 [particle=\(\mu^{-}\)],
};
\feynmandiagram [large, vertical=e to f] {
a -- [fermion] b -- [photon, momentum=\(k\)] c -- [fermion] d,
b -- [fermion, momentum'=\(p_{1}\)] e -- [fermion, momentum'=\(p_{2}\)] c,
e -- [gluon] f,
h -- [fermion] f -- [fermion] i,
};
\begin{tikzpicture}
\begin{feynman}
\vertex (a1) {\(\overline b\)};
\vertex[right=1cm of a1] (a2);
\vertex[right=1cm of a2] (a3);
\vertex[right=1cm of a3] (a4) {\(b\)};
\vertex[right=1cm of a4] (a5);
\vertex[right=2cm of a5] (a6) {\(u\)};
\vertex[below=2em of a1] (b1) {\(d\)};
\vertex[right=1cm of b1] (b2);
\vertex[right=1cm of b2] (b3);
\vertex[right=1cm of b3] (b4) {\(\overline d\)};
\vertex[below=2em of a6] (b5) {\(\overline d\)};
\vertex[above=of a6] (c1) {\(\overline u\)};
\vertex[above=2em of c1] (c3) {\(d\)};
\vertex at ($(c1)!0.5!(c3) - (1cm, 0)$) (c2);
\diagram* {
{[edges=fermion]
(b1) -- (b2) -- (a2) -- (a1),
(b5) -- (b4) -- (b3) -- (a3) -- (a4) -- (a5) -- (a6),
},
(a2) -- [boson, edge label=\(W\)] (a3),
(b2) -- [boson, edge label'=\(W\)] (b3),
(c1) -- [fermion, out=180, in=-45] (c2) -- [fermion, out=45, in=180] (c3),
(a5) -- [boson, bend left, edge label=\(W^{-}\)] (c2),
};
\draw [decoration={brace}, decorate] (b1.south west) -- (a1.north west)
node [pos=0.5, left] {\(B^{0}\)};
\draw [decoration={brace}, decorate] (c3.north east) -- (c1.south east)
node [pos=0.5, right] {\(\pi^{-}\)};
\draw [decoration={brace}, decorate] (a6.north east) -- (b5.south east)
node [pos=0.5, right] {\(\pi^{+}\)};
\end{feynman}
\end{tikzpicture}
\end{document}


@@ -0,0 +1,3 @@
{
"compiler": "lualatex"
}


@@ -0,0 +1,23 @@
#!/bin/bash -x
export SHARELATEX_CONFIG=`pwd`/test/acceptance/scripts/settings.test.coffee
echo ">> Starting server..."
grunt --no-color >server.log 2>&1 &
echo ">> Server started"
sleep 5
echo ">> Running acceptance tests..."
grunt --no-color mochaTest:acceptance
_test_exit_code=$?
echo ">> Killing server"
kill %1
echo ">> Done"
exit $_test_exit_code


@@ -0,0 +1,47 @@
Path = require "path"
module.exports =
# Options are passed to Sequelize.
# See http://sequelizejs.com/documentation#usage-options for details
mysql:
clsi:
database: "clsi"
username: "clsi"
password: null
dialect: "sqlite"
storage: Path.resolve("db.sqlite")
path:
compilesDir: Path.resolve(__dirname + "/../../../compiles")
clsiCacheDir: Path.resolve(__dirname + "/../../../cache")
#synctexBaseDir: (project_id) -> Path.join(@compilesDir, project_id)
synctexBaseDir: () -> "/compile"
sandboxedCompilesHostDir: process.env['SANDBOXED_COMPILES_HOST_DIR']
clsi:
#strace: true
#archive_logs: true
commandRunner: "docker-runner-sharelatex"
latexmkCommandPrefix: ["/usr/bin/time", "-v"] # on Linux
docker:
image: process.env.TEXLIVE_IMAGE || "texlive-full:2017.1-opt"
env:
PATH: "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/2017/bin/x86_64-linux/"
HOME: "/tmp"
modem:
socketPath: false
user: process.env.SIBLING_CONTAINER_USER || "111"
internal:
clsi:
port: 3013
load_port: 3044
host: "localhost"
apis:
clsi:
url: "http://localhost:3013"
smokeTest: false
project_cache_length_ms: 1000 * 60 * 60 * 24
parallelFileDownloads: 1
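
The commandRunner setting selects between sibling-container (Docker) execution and running compiles in-process. A hedged sketch of that dispatch (the fallback module name is illustrative; LocalCommandRunner is the runner referenced elsewhere in the repo):

# Hedged sketch: resolve the command runner named in settings,
# falling back to local in-process execution when none is set.
Settings = require "settings-sharelatex"

commandRunnerPath = Settings.clsi?.commandRunner or "./LocalCommandRunner"
CommandRunner = require commandRunnerPath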


@@ -6,19 +6,48 @@ Settings = require "settings-sharelatex"
buildUrl = (path) -> "http://#{Settings.internal.clsi.host}:#{Settings.internal.clsi.port}/#{path}"
url = buildUrl("project/smoketest-#{process.pid}/compile")
describe "Running a compile", ->
before (done) ->
request.post {
url: buildUrl("project/smoketest/compile")
url: url
json:
compile:
resources: [
path: "main.tex"
content: """
\\documentclass{article}
\\begin{document}
Hello world
\\end{document}
% Membrane-like surface
% Author: Yotam Avital
\\documentclass{article}
\\usepackage{tikz}
\\usetikzlibrary{calc,fadings,decorations.pathreplacing}
\\begin{document}
\\begin{tikzpicture}
\\def\\nuPi{3.1459265}
\\foreach \\i in {5,4,...,2}{% This one doesn't matter
\\foreach \\j in {3,2,...,0}{% This will create a membrane
% with the front lipids visible
% top layer
\\pgfmathsetmacro{\\dx}{rand*0.1}% A random variance in the x coordinate
\\pgfmathsetmacro{\\dy}{rand*0.1}% A random variance in the y coordinate,
% gives a height fill to the lipid
\\pgfmathsetmacro{\\rot}{rand*0.1}% A random variance in the
% molecule orientation
\\shade[ball color=red] ({\\i+\\dx+\\rot},{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)}) circle(0.45);
\\shade[ball color=gray] (\\i+\\dx,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-0.9}) circle(0.45);
\\shade[ball color=gray] (\\i+\\dx-\\rot,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-1.8}) circle(0.45);
% bottom layer
\\pgfmathsetmacro{\\dx}{rand*0.1}
\\pgfmathsetmacro{\\dy}{rand*0.1}
\\pgfmathsetmacro{\\rot}{rand*0.1}
\\shade[ball color=gray] (\\i+\\dx+\\rot,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-2.8}) circle(0.45);
\\shade[ball color=gray] (\\i+\\dx,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-3.7}) circle(0.45);
\\shade[ball color=red] (\\i+\\dx-\\rot,{0.5*\\j+\\dy+0.4*sin(\\i*\\nuPi*10)-4.6}) circle(0.45);
}
}
\\end{tikzpicture}
\\end{document}
"""
]
}, (@error, @response, @body) =>


@@ -14,7 +14,7 @@ describe "CompileController", ->
clsi:
url: "http://clsi.example.com"
"./ProjectPersistenceManager": @ProjectPersistenceManager = {}
"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub(), err:sinon.stub(), warn: sinon.stub()}
@Settings.externalUrl = "http://www.example.com"
@req = {}
@res = {}
@@ -49,7 +49,7 @@ describe "CompileController", ->
describe "successfully", ->
beforeEach ->
@CompileManager.doCompile = sinon.stub().callsArgWith(1, null, @output_files)
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, null, @output_files)
@CompileController.compile @req, @res
it "should parse the request", ->
@@ -58,7 +58,7 @@ describe "CompileController", ->
.should.equal true
it "should run the compile for the specified project", ->
@CompileManager.doCompile
@CompileManager.doCompileWithLock
.calledWith(@request_with_project_id)
.should.equal true
@@ -75,7 +75,8 @@ describe "CompileController", ->
status: "success"
error: null
outputFiles: @output_files.map (file) =>
url: "#{@Settings.apis.clsi.url}/project/#{@project_id}/output/#{file.path}"
url: "#{@Settings.apis.clsi.url}/project/#{@project_id}/build/#{file.build}/output/#{file.path}"
path: file.path
type: file.type
build: file.build
)
@@ -83,7 +84,7 @@ describe "CompileController", ->
describe "with an error", ->
beforeEach ->
@CompileManager.doCompile = sinon.stub().callsArgWith(1, new Error(@message = "error message"), null)
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, new Error(@message = "error message"), null)
@CompileController.compile @req, @res
it "should return the JSON response with the error", ->
@@ -101,7 +102,7 @@ describe "CompileController", ->
beforeEach ->
@error = new Error(@message = "container timed out")
@error.timedout = true
@CompileManager.doCompile = sinon.stub().callsArgWith(1, @error, null)
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, @error, null)
@CompileController.compile @req, @res
it "should return the JSON response with the timeout status", ->
@@ -117,7 +118,7 @@ describe "CompileController", ->
describe "when the request returns no output files", ->
beforeEach ->
@CompileManager.doCompile = sinon.stub().callsArgWith(1, null, [])
@CompileManager.doCompileWithLock = sinon.stub().callsArgWith(1, null, [])
@CompileController.compile @req, @res
it "should return the JSON response with the failure status", ->
@@ -143,19 +144,19 @@ describe "CompileController", ->
file: @file
line: @line.toString()
column: @column.toString()
@res.send = sinon.stub()
@res.json = sinon.stub()
@CompileManager.syncFromCode = sinon.stub().callsArgWith(4, null, @pdfPositions = ["mock-positions"])
@CompileManager.syncFromCode = sinon.stub().callsArgWith(5, null, @pdfPositions = ["mock-positions"])
@CompileController.syncFromCode @req, @res, @next
it "should find the corresponding location in the PDF", ->
@CompileManager.syncFromCode
.calledWith(@project_id, @file, @line, @column)
.calledWith(@project_id, undefined, @file, @line, @column)
.should.equal true
it "should return the positions", ->
@res.send
.calledWith(JSON.stringify
@res.json
.calledWith(
pdf: @pdfPositions
)
.should.equal true
@@ -172,19 +173,19 @@ describe "CompileController", ->
page: @page.toString()
h: @h.toString()
v: @v.toString()
@res.send = sinon.stub()
@res.json = sinon.stub()
@CompileManager.syncFromPdf = sinon.stub().callsArgWith(4, null, @codePositions = ["mock-positions"])
@CompileManager.syncFromPdf = sinon.stub().callsArgWith(5, null, @codePositions = ["mock-positions"])
@CompileController.syncFromPdf @req, @res, @next
it "should find the corresponding location in the code", ->
@CompileManager.syncFromPdf
.calledWith(@project_id, @page, @h, @v)
.calledWith(@project_id, undefined, @page, @h, @v)
.should.equal true
it "should return the positions", ->
@res.send
.calledWith(JSON.stringify
@res.json
.calledWith(
code: @codePositions
)
.should.equal true
@@ -197,19 +198,20 @@ describe "CompileController", ->
project_id: @project_id
@req.query =
file: @file
@res.send = sinon.stub()
image: @image = "example.com/image"
@res.json = sinon.stub()
@CompileManager.wordcount = sinon.stub().callsArgWith(2, null, @texcount = ["mock-texcount"])
@CompileManager.wordcount = sinon.stub().callsArgWith(4, null, @texcount = ["mock-texcount"])
@CompileController.wordcount @req, @res, @next
it "should return the word count of a file", ->
@CompileManager.wordcount
.calledWith(@project_id, @file)
.calledWith(@project_id, undefined, @file, @image)
.should.equal true
it "should return the texcount info", ->
@res.send
.calledWith(JSON.stringify
@res.json
.calledWith(
texcount: @texcount
)
.should.equal true


@@ -13,12 +13,74 @@ describe "CompileManager", ->
"./ResourceWriter": @ResourceWriter = {}
"./OutputFileFinder": @OutputFileFinder = {}
"./OutputCacheManager": @OutputCacheManager = {}
"settings-sharelatex": @Settings = { path: compilesDir: "/compiles/dir" }
"logger-sharelatex": @logger = { log: sinon.stub() }
"settings-sharelatex": @Settings =
path:
compilesDir: "/compiles/dir"
synctexBaseDir: -> "/compile"
clsi:
docker:
image: "SOMEIMAGE"
"logger-sharelatex": @logger = { log: sinon.stub() , info:->}
"child_process": @child_process = {}
"./CommandRunner": @CommandRunner = {}
"./DraftModeManager": @DraftModeManager = {}
"./TikzManager": @TikzManager = {}
"./LockManager": @LockManager = {}
"fs": @fs = {}
"fs-extra": @fse = { ensureDir: sinon.stub().callsArg(1) }
@callback = sinon.stub()
@project_id = "project-id-123"
@user_id = "1234"
describe "doCompileWithLock", ->
beforeEach ->
@request =
resources: @resources = "mock-resources"
project_id: @project_id
user_id: @user_id
@output_files = ["foo", "bar"]
@Settings.compileDir = "compiles"
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@CompileManager.doCompile = sinon.stub().callsArgWith(1, null, @output_files)
@LockManager.runWithLock = (lockFile, runner, callback) ->
runner (err, result...) ->
callback(err, result...)
describe "when the project is not locked", ->
beforeEach ->
@CompileManager.doCompileWithLock @request, @callback
it "should ensure that the compile directory exists", ->
@fse.ensureDir.calledWith(@compileDir)
.should.equal true
it "should call doCompile with the request", ->
@CompileManager.doCompile
.calledWith(@request)
.should.equal true
it "should call the callback with the output files", ->
@callback.calledWithExactly(null, @output_files)
.should.equal true
describe "when the project is locked", ->
beforeEach ->
@error = new Error("locked")
@LockManager.runWithLock = (lockFile, runner, callback) =>
callback(@error)
@CompileManager.doCompileWithLock @request, @callback
it "should ensure that the compile directory exists", ->
@fse.ensureDir.calledWith(@compileDir)
.should.equal true
it "should not call doCompile with the request", ->
@CompileManager.doCompile
.called.should.equal false
it "should call the callback with the error", ->
@callback.calledWithExactly(@error)
.should.equal true
describe "doCompile", ->
beforeEach ->
@@ -41,54 +103,117 @@ describe "CompileManager", ->
@request =
resources: @resources = "mock-resources"
rootResourcePath: @rootResourcePath = "main.tex"
project_id: @project_id = "project-id-123"
project_id: @project_id
user_id: @user_id
compiler: @compiler = "pdflatex"
timeout: @timeout = 42000
imageName: @image = "example.com/image"
flags: @flags = ["-file-line-error"]
@env = {}
@Settings.compileDir = "compiles"
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}"
@ResourceWriter.syncResourcesToDisk = sinon.stub().callsArg(3)
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@ResourceWriter.syncResourcesToDisk = sinon.stub().callsArgWith(2, null, @resources)
@LatexRunner.runLatex = sinon.stub().callsArg(2)
@OutputFileFinder.findOutputFiles = sinon.stub().callsArgWith(2, null, @output_files)
@OutputCacheManager.saveOutputFiles = sinon.stub().callsArgWith(2, null, @build_files)
@CompileManager.doCompile @request, @callback
@DraftModeManager.injectDraftMode = sinon.stub().callsArg(1)
@TikzManager.checkMainFile = sinon.stub().callsArgWith(3, null, false)
it "should write the resources to disk", ->
@ResourceWriter.syncResourcesToDisk
.calledWith(@project_id, @resources, @compileDir)
.should.equal true
describe "normally", ->
beforeEach ->
@CompileManager.doCompile @request, @callback
it "should run LaTeX", ->
@LatexRunner.runLatex
.calledWith(@project_id, {
directory: @compileDir
mainFile: @rootResourcePath
compiler: @compiler
timeout: @timeout
})
.should.equal true
it "should write the resources to disk", ->
@ResourceWriter.syncResourcesToDisk
.calledWith(@request, @compileDir)
.should.equal true
it "should find the output files", ->
@OutputFileFinder.findOutputFiles
.calledWith(@resources, @compileDir)
.should.equal true
it "should run LaTeX", ->
@LatexRunner.runLatex
.calledWith("#{@project_id}-#{@user_id}", {
directory: @compileDir
mainFile: @rootResourcePath
compiler: @compiler
timeout: @timeout
image: @image
flags: @flags
environment: @env
})
.should.equal true
it "should return the output files", ->
@callback.calledWith(null, @build_files).should.equal true
it "should find the output files", ->
@OutputFileFinder.findOutputFiles
.calledWith(@resources, @compileDir)
.should.equal true
it "should return the output files", ->
@callback.calledWith(null, @build_files).should.equal true
it "should not inject draft mode by default", ->
@DraftModeManager.injectDraftMode.called.should.equal false
describe "with draft mode", ->
beforeEach ->
@request.draft = true
@CompileManager.doCompile @request, @callback
it "should inject the draft mode header", ->
@DraftModeManager.injectDraftMode
.calledWith(@compileDir + "/" + @rootResourcePath)
.should.equal true
describe "with a check option", ->
beforeEach ->
@request.check = "error"
@CompileManager.doCompile @request, @callback
it "should run chktex", ->
@LatexRunner.runLatex
.calledWith("#{@project_id}-#{@user_id}", {
directory: @compileDir
mainFile: @rootResourcePath
compiler: @compiler
timeout: @timeout
image: @image
flags: @flags
environment: {'CHKTEX_OPTIONS': '-nall -e9 -e10 -w15 -w16', 'CHKTEX_EXIT_ON_ERROR':1, 'CHKTEX_ULIMIT_OPTIONS': '-t 5 -v 64000'}
})
.should.equal true
describe "with a knitr file and check options", ->
beforeEach ->
@request.rootResourcePath = "main.Rtex"
@request.check = "error"
@CompileManager.doCompile @request, @callback
it "should not run chktex", ->
@LatexRunner.runLatex
.calledWith("#{@project_id}-#{@user_id}", {
directory: @compileDir
mainFile: "main.Rtex"
compiler: @compiler
timeout: @timeout
image: @image
flags: @flags
environment: @env
})
.should.equal true
describe "clearProject", ->
describe "succesfully", ->
beforeEach ->
@Settings.compileDir = "compiles"
@fs.lstat = sinon.stub().callsArgWith(1, null,{isDirectory: ()->true})
@proc = new EventEmitter()
@proc.stdout = new EventEmitter()
@proc.stderr = new EventEmitter()
@child_process.spawn = sinon.stub().returns(@proc)
@CompileManager.clearProject @project_id, @callback
@CompileManager.clearProject @project_id, @user_id, @callback
@proc.emit "close", 0
it "should remove the project directory", ->
@child_process.spawn
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}"])
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"])
.should.equal true
it "should call the callback", ->
@@ -97,17 +222,18 @@ describe "CompileManager", ->
describe "with a non-success status code", ->
beforeEach ->
@Settings.compileDir = "compiles"
@fs.lstat = sinon.stub().callsArgWith(1, null,{isDirectory: ()->true})
@proc = new EventEmitter()
@proc.stdout = new EventEmitter()
@proc.stderr = new EventEmitter()
@child_process.spawn = sinon.stub().returns(@proc)
@CompileManager.clearProject @project_id, @callback
@CompileManager.clearProject @project_id, @user_id, @callback
@proc.stderr.emit "data", @error = "oops"
@proc.emit "close", 1
it "should remove the project directory", ->
@child_process.spawn
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}"])
.calledWith("rm", ["-r", "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"])
.should.equal true
it "should call the callback with an error from the stderr", ->
@@ -115,7 +241,7 @@ describe "CompileManager", ->
.calledWith(new Error())
.should.equal true
@callback.args[0][0].message.should.equal "rm -r #{@Settings.path.compilesDir}/#{@project_id} failed: #{@error}"
@callback.args[0][0].message.should.equal "rm -r #{@Settings.path.compilesDir}/#{@project_id}-#{@user_id} failed: #{@error}"
describe "syncing", ->
beforeEach ->
@@ -128,20 +254,28 @@ describe "CompileManager", ->
@column = 3
@file_name = "main.tex"
@child_process.execFile = sinon.stub()
@Settings.path.synctexBaseDir = (project_id) => "#{@Settings.path.compilesDir}/#{@project_id}"
@Settings.path.synctexBaseDir = (project_id) => "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
describe "syncFromCode", ->
beforeEach ->
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@page}\t#{@h}\t#{@v}\t#{@width}\t#{@height}\n", "")
@CompileManager.syncFromCode @project_id, @file_name, @line, @column, @callback
@fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true})
@stdout = "NODE\t#{@page}\t#{@h}\t#{@v}\t#{@width}\t#{@height}\n"
@CommandRunner.run = sinon.stub().callsArgWith(6, null, {stdout:@stdout})
@CompileManager.syncFromCode @project_id, @user_id, @file_name, @line, @column, @callback
it "should execute the synctex binary", ->
bin_path = Path.resolve(__dirname + "/../../../bin/synctex")
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}/output.pdf"
file_path = "#{@Settings.path.compilesDir}/#{@project_id}/#{@file_name}"
@child_process.execFile
.calledWith(bin_path, ["code", synctex_path, file_path, @line, @column], timeout: 10000)
.should.equal true
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf"
file_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}"
@CommandRunner.run
.calledWith(
"#{@project_id}-#{@user_id}",
['/opt/synctex', 'code', synctex_path, file_path, @line, @column],
"#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}",
@Settings.clsi.docker.image,
60000,
{}
).should.equal true
it "should call the callback with the parsed output", ->
@callback
@@ -156,15 +290,22 @@ describe "CompileManager", ->
describe "syncFromPdf", ->
beforeEach ->
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@Settings.path.compilesDir}/#{@project_id}/#{@file_name}\t#{@line}\t#{@column}\n", "")
@CompileManager.syncFromPdf @project_id, @page, @h, @v, @callback
@fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true})
@stdout = "NODE\t#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}\t#{@line}\t#{@column}\n"
@CommandRunner.run = sinon.stub().callsArgWith(6, null, {stdout:@stdout})
@CompileManager.syncFromPdf @project_id, @user_id, @page, @h, @v, @callback
it "should execute the synctex binary", ->
bin_path = Path.resolve(__dirname + "/../../../bin/synctex")
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}/output.pdf"
@child_process.execFile
.calledWith(bin_path, ["pdf", synctex_path, @page, @h, @v], timeout: 10000)
.should.equal true
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf"
@CommandRunner.run
.calledWith(
"#{@project_id}-#{@user_id}",
['/opt/synctex', "pdf", synctex_path, @page, @h, @v],
"#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}",
@Settings.clsi.docker.image,
60000,
{}).should.equal true
it "should call the callback with the parsed output", ->
@callback
@@ -177,24 +318,25 @@ describe "CompileManager", ->
describe "wordcount", ->
beforeEach ->
@CommandRunner.run = sinon.stub().callsArg(4)
@fs.readFileSync = sinon.stub().returns @stdout = "Encoding: ascii\nWords in text: 2"
@CommandRunner.run = sinon.stub().callsArg(6)
@fs.readFile = sinon.stub().callsArgWith(2, null, @stdout = "Encoding: ascii\nWords in text: 2")
@callback = sinon.stub()
@project_id = "project-id-123"
@timeout = 10 * 1000
@project_id
@timeout = 60 * 1000
@file_name = "main.tex"
@Settings.path.compilesDir = "/local/compile/directory"
@image = "example.com/image"
@CompileManager.wordcount @project_id, @file_name, @callback
@CompileManager.wordcount @project_id, @user_id, @file_name, @image, @callback
it "should run the texcount command", ->
@directory = "#{@Settings.path.compilesDir}/#{@project_id}"
@directory = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@file_path = "$COMPILE_DIR/#{@file_name}"
@command =[ "texcount", "-inc", @file_path, "-out=" + @file_path + ".wc"]
@command =[ "texcount", "-nocol", "-inc", @file_path, "-out=" + @file_path + ".wc"]
@CommandRunner.run
.calledWith(@project_id, @command, @directory, @timeout)
.calledWith("#{@project_id}-#{@user_id}", @command, @directory, @image, @timeout, {})
.should.equal true
it "should call the callback with the parsed output", ->
@@ -208,5 +350,7 @@ describe "CompileManager", ->
elements: 0
mathInline: 0
mathDisplay: 0
errors: 0
messages: ""
})
.should.equal true


@@ -0,0 +1,55 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/ContentTypeMapper'
describe 'ContentTypeMapper', ->
beforeEach ->
@ContentTypeMapper = SandboxedModule.require modulePath
describe 'map', ->
it 'should map .txt to text/plain', ->
content_type = @ContentTypeMapper.map('example.txt')
content_type.should.equal 'text/plain'
it 'should map .csv to text/csv', ->
content_type = @ContentTypeMapper.map('example.csv')
content_type.should.equal 'text/csv'
it 'should map .pdf to application/pdf', ->
content_type = @ContentTypeMapper.map('example.pdf')
content_type.should.equal 'application/pdf'
it 'should fall back to octet-stream', ->
content_type = @ContentTypeMapper.map('example.unknown')
content_type.should.equal 'application/octet-stream'
describe 'coercing web files to plain text', ->
it 'should map .js to plain text', ->
content_type = @ContentTypeMapper.map('example.js')
content_type.should.equal 'text/plain'
it 'should map .html to plain text', ->
content_type = @ContentTypeMapper.map('example.html')
content_type.should.equal 'text/plain'
it 'should map .css to plain text', ->
content_type = @ContentTypeMapper.map('example.css')
content_type.should.equal 'text/plain'
describe 'image files', ->
it 'should map .png to image/png', ->
content_type = @ContentTypeMapper.map('example.png')
content_type.should.equal 'image/png'
it 'should map .jpeg to image/jpeg', ->
content_type = @ContentTypeMapper.map('example.jpeg')
content_type.should.equal 'image/jpeg'
it 'should map .svg to text/plain to protect against XSS (SVG can execute JS)', ->
content_type = @ContentTypeMapper.map('example.svg')
content_type.should.equal 'text/plain'


@@ -0,0 +1,145 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
require "coffee-script"
modulePath = require('path').join __dirname, '../../../app/coffee/DockerLockManager'
describe "LockManager", ->
beforeEach ->
@LockManager = SandboxedModule.require modulePath, requires:
"settings-sharelatex": @Settings =
clsi: docker: {}
"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
describe "runWithLock", ->
describe "with a single lock", ->
beforeEach (done) ->
@callback = sinon.stub()
@LockManager.runWithLock "lock-one", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world")
, 100
, (err, args...) =>
@callback(err,args...)
done()
it "should call the callback", ->
@callback.calledWith(null,"hello","world").should.equal true
describe "with two locks", ->
beforeEach (done) ->
@callback1 = sinon.stub()
@callback2 = sinon.stub()
@LockManager.runWithLock "lock-one", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 100
, (err, args...) =>
@callback1(err,args...)
@LockManager.runWithLock "lock-two", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 200
, (err, args...) =>
@callback2(err,args...)
done()
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback", ->
@callback2.calledWith(null,"hello","world","two").should.equal true
describe "with lock contention", ->
describe "where the first lock is released quickly", ->
beforeEach (done) ->
@LockManager.MAX_LOCK_WAIT_TIME = 1000
@LockManager.LOCK_TEST_INTERVAL = 100
@callback1 = sinon.stub()
@callback2 = sinon.stub()
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 100
, (err, args...) =>
@callback1(err,args...)
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 200
, (err, args...) =>
@callback2(err,args...)
done()
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback", ->
@callback2.calledWith(null,"hello","world","two").should.equal true
describe "where the first lock is held longer than the waiting time", ->
beforeEach (done) ->
@LockManager.MAX_LOCK_HOLD_TIME = 10000
@LockManager.MAX_LOCK_WAIT_TIME = 1000
@LockManager.LOCK_TEST_INTERVAL = 100
@callback1 = sinon.stub()
@callback2 = sinon.stub()
doneOne = doneTwo = false
finish = (key) ->
doneOne = true if key is 1
doneTwo = true if key is 2
done() if doneOne and doneTwo
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 1100
, (err, args...) =>
@callback1(err,args...)
finish(1)
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 100
, (err, args...) =>
@callback2(err,args...)
finish(2)
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback with an error", ->
error = sinon.match.instanceOf Error
@callback2.calledWith(error).should.equal true
describe "where the first lock is held longer than the max holding time", ->
beforeEach (done) ->
@LockManager.MAX_LOCK_HOLD_TIME = 1000
@LockManager.MAX_LOCK_WAIT_TIME = 2000
@LockManager.LOCK_TEST_INTERVAL = 100
@callback1 = sinon.stub()
@callback2 = sinon.stub()
doneOne = doneTwo = false
finish = (key) ->
doneOne = true if key is 1
doneTwo = true if key is 2
done() if doneOne and doneTwo
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 1500
, (err, args...) =>
@callback1(err,args...)
finish(1)
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 100
, (err, args...) =>
@callback2(err,args...)
finish(2)
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback", ->
@callback2.calledWith(null,"hello","world","two").should.equal true


@@ -0,0 +1,509 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
expect = require('chai').expect
require "coffee-script"
modulePath = require('path').join __dirname, '../../../app/coffee/DockerRunner'
Path = require "path"
describe "DockerRunner", ->
beforeEach ->
@container = container = {}
@DockerRunner = SandboxedModule.require modulePath, requires:
"settings-sharelatex": @Settings =
clsi: docker: {}
path: {}
"logger-sharelatex": @logger = {
log: sinon.stub(),
error: sinon.stub(),
info: sinon.stub(),
warn: sinon.stub()
}
"dockerode": class Docker
getContainer: sinon.stub().returns(container)
createContainer: sinon.stub().yields(null, container)
listContainers: sinon.stub()
"fs": @fs = { stat: sinon.stub().yields(null,{isDirectory:()->true}) }
"./Metrics":
Timer: class Timer
done: () ->
"./LockManager":
runWithLock: (key, runner, callback) -> runner(callback)
@Docker = Docker
@getContainer = Docker::getContainer
@createContainer = Docker::createContainer
@listContainers = Docker::listContainers
@directory = "/local/compile/directory"
@mainFile = "main-file.tex"
@compiler = "pdflatex"
@image = "example.com/sharelatex/image:2016.2"
@env = {}
@callback = sinon.stub()
@project_id = "project-id-123"
@volumes =
"/local/compile/directory": "/compile"
@Settings.clsi.docker.image = @defaultImage = "default-image"
@Settings.clsi.docker.env = PATH: "mock-path"
describe "run", ->
beforeEach (done)->
@DockerRunner._getContainerOptions = sinon.stub().returns(@options = {mockoptions: "foo"})
@DockerRunner._fingerprintContainer = sinon.stub().returns(@fingerprint = "fingerprint")
@name = "project-#{@project_id}-#{@fingerprint}"
@command = ["mock", "command", "--outdir=$COMPILE_DIR"]
@command_with_dir = ["mock", "command", "--outdir=/compile"]
@timeout = 42000
done()
describe "successfully", ->
beforeEach (done)->
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, (err, output)=>
@callback(err, output)
done()
it "should generate the options for the container", ->
@DockerRunner._getContainerOptions
.calledWith(@command_with_dir, @image, @volumes, @timeout)
.should.equal true
it "should generate the fingerprint from the returned options", ->
@DockerRunner._fingerprintContainer
.calledWith(@options)
.should.equal true
it "should do the run", ->
@DockerRunner._runAndWaitForContainer
.calledWith(@options, @volumes, @timeout)
.should.equal true
it "should call the callback", ->
@callback.calledWith(null, @output).should.equal true
describe 'when path.sandboxedCompilesHostDir is set', ->
beforeEach ->
@Settings.path.sandboxedCompilesHostDir = '/some/host/dir/compiles'
@directory = '/var/lib/sharelatex/data/compiles/xyz'
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, @callback
it 'should re-write the bind directory', ->
volumes = @DockerRunner._runAndWaitForContainer.lastCall.args[1]
expect(volumes).to.deep.equal {
'/some/host/dir/compiles/xyz': '/compile'
}
it "should call the callback", ->
@callback.calledWith(null, @output).should.equal true
describe "when the run throws an error", ->
beforeEach ->
firstTime = true
@output = "mock-output"
@DockerRunner._runAndWaitForContainer = (options, volumes, timeout, callback = (error, output)->) =>
if firstTime
firstTime = false
callback new Error("HTTP code is 500 which indicates error: server error")
else
callback(null, @output)
sinon.spy @DockerRunner, "_runAndWaitForContainer"
@DockerRunner.destroyContainer = sinon.stub().callsArg(3)
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, @callback
it "should do the run twice", ->
@DockerRunner._runAndWaitForContainer
.calledTwice.should.equal true
it "should destroy the container in between", ->
@DockerRunner.destroyContainer
.calledWith(@name, null)
.should.equal true
it "should call the callback", ->
@callback.calledWith(null, @output).should.equal true
describe "with no image", ->
beforeEach ->
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, null, @timeout, @env, @callback
it "should use the default image", ->
@DockerRunner._getContainerOptions
.calledWith(@command_with_dir, @defaultImage, @volumes, @timeout)
.should.equal true
describe "with image override", ->
beforeEach ->
@Settings.texliveImageNameOveride = "overrideimage.com/something"
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, @callback
it "should use the override and keep the tag", ->
image = @DockerRunner._getContainerOptions.args[0][1]
image.should.equal "overrideimage.com/something/image:2016.2"
describe "_runAndWaitForContainer", ->
beforeEach ->
@options = {mockoptions: "foo", name: @name = "mock-name"}
@DockerRunner.startContainer = (options, volumes, attachStreamHandler, callback) =>
attachStreamHandler(null, @output = "mock-output")
callback(null, @containerId = "container-id")
sinon.spy @DockerRunner, "startContainer"
@DockerRunner.waitForContainer = sinon.stub().callsArgWith(2, null, @exitCode = 42)
@DockerRunner._runAndWaitForContainer @options, @volumes, @timeout, @callback
it "should create/start the container", ->
@DockerRunner.startContainer
.calledWith(@options, @volumes)
.should.equal true
it "should wait for the container to finish", ->
@DockerRunner.waitForContainer
.calledWith(@name, @timeout)
.should.equal true
it "should call the callback with the output", ->
@callback.calledWith(null, @output).should.equal true
describe "startContainer", ->
beforeEach ->
@attachStreamHandler = sinon.stub()
@options = {mockoptions: "foo", name: "mock-name"}
@container.inspect = sinon.stub().callsArgWith(0)
@DockerRunner.attachToContainer = (containerId, attachStreamHandler, cb)=>
attachStreamHandler()
cb()
sinon.spy @DockerRunner, "attachToContainer"
describe "when the container exists", ->
beforeEach ->
@container.inspect = sinon.stub().callsArgWith(0)
@container.start = sinon.stub().yields()
@DockerRunner.startContainer @options, @volumes, @callback, ->
it "should start the container with the given name", ->
@getContainer
.calledWith(@options.name)
.should.equal true
@container.start
.called
.should.equal true
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should attach to the container", ->
@DockerRunner.attachToContainer.called.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
it "should attach before the container starts", ->
sinon.assert.callOrder(@DockerRunner.attachToContainer, @container.start)
describe "when the container does not exist", ->
beforeEach ->
@container.start = sinon.stub().yields()
@container.inspect = sinon.stub().callsArgWith(0, {statusCode:404})
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should create the container", ->
@createContainer
.calledWith(@options)
.should.equal true
it "should call the callback and stream handler", ->
@attachStreamHandler.called.should.equal true
@callback.called.should.equal true
it "should attach to the container", ->
@DockerRunner.attachToContainer.called.should.equal true
it "should attach before the container starts", ->
sinon.assert.callOrder(@DockerRunner.attachToContainer, @container.start)
describe "when the container is already running", ->
beforeEach ->
error = new Error("HTTP code is 304 which indicates error: server error - start: Cannot start container #{@name}: The container MOCKID is already running.")
error.statusCode = 304
@container.start = sinon.stub().yields(error)
@container.inspect = sinon.stub().callsArgWith(0)
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback and stream handler without an error", ->
@attachStreamHandler.called.should.equal true
@callback.called.should.equal true
describe "when a volume does not exist", ->
beforeEach ->
@fs.stat = sinon.stub().yields(new Error("no such path"))
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback with an error", ->
@callback.calledWith(new Error()).should.equal true
describe "when a volume exists but is not a directory", ->
beforeEach ->
@fs.stat = sinon.stub().yields(null, {isDirectory: () -> return false})
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback with an error", ->
@callback.calledWith(new Error()).should.equal true
describe "when a volume does not exist, but sibling-containers are used", ->
beforeEach ->
@fs.stat = sinon.stub().yields(new Error("no such path"))
@Settings.path.sandboxedCompilesHostDir = '/some/path'
@container.start = sinon.stub().yields()
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
afterEach ->
delete @Settings.path.sandboxedCompilesHostDir
it "should start the container with the given name", ->
@getContainer
.calledWith(@options.name)
.should.equal true
@container.start
.called
.should.equal true
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback", ->
@callback.called.should.equal true
@callback.calledWith(new Error()).should.equal false
describe "when the container tries to be created, but already has been (race condition)", ->
describe "waitForContainer", ->
beforeEach ->
@containerId = "container-id"
@timeout = 5000
@container.wait = sinon.stub().yields(null, StatusCode: @statusCode = 42)
@container.kill = sinon.stub().yields()
describe "when the container returns in time", ->
beforeEach ->
@DockerRunner.waitForContainer @containerId, @timeout, @callback
it "should wait for the container", ->
@getContainer
.calledWith(@containerId)
.should.equal true
@container.wait
.called
.should.equal true
it "should call the callback with the exit", ->
@callback
.calledWith(null, @statusCode)
.should.equal true
describe "when the container does not return before the timeout", ->
beforeEach (done) ->
@container.wait = (callback = (error, exitCode) ->) ->
setTimeout () ->
callback(null, StatusCode: 42)
, 100
@timeout = 5
@DockerRunner.waitForContainer @containerId, @timeout, (args...) =>
@callback(args...)
done()
it "should call kill on the container", ->
@getContainer
.calledWith(@containerId)
.should.equal true
@container.kill
.called
.should.equal true
it "should call the callback with an error", ->
error = new Error("container timed out")
error.timedout = true
@callback
.calledWith(error)
.should.equal true
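# Sketch of the timeout path asserted above: wait on the container, and if the
# timeout fires first, kill it and call back with a timedout error. `docker`
# stands in for a dockerode instance; an assumption, not the module's wiring.
DockerRunner.waitForContainer = (containerId, timeout, _callback = (error, exitCode) ->) ->
  callback = (args...) ->
    _callback(args...)
    _callback = ->  # whichever path loses must not call back again
  container = docker.getContainer(containerId)
  timeoutId = setTimeout ->
    error = new Error("container timed out")
    error.timedout = true
    container.kill -> callback(error)
  , timeout
  container.wait (error, res) ->
    clearTimeout(timeoutId)
    return callback(error) if error?
    callback(null, res?.StatusCode)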
describe "destroyOldContainers", ->
beforeEach (done) ->
oneHourInSeconds = 60 * 60
oneHourInMilliseconds = oneHourInSeconds * 1000
nowInSeconds = Date.now()/1000
@containers = [{
Name: "/project-old-container-name"
Id: "old-container-id"
Created: nowInSeconds - oneHourInSeconds - 100
}, {
Name: "/project-new-container-name"
Id: "new-container-id"
Created: nowInSeconds - oneHourInSeconds + 100
}, {
Name: "/totally-not-a-project-container"
Id: "some-random-id"
Created: nowInSeconds - (2 * oneHourInSeconds)
}]
@DockerRunner.MAX_CONTAINER_AGE = oneHourInMilliseconds
@listContainers.callsArgWith(1, null, @containers)
@DockerRunner.destroyContainer = sinon.stub().callsArg(3)
@DockerRunner.destroyOldContainers (error) =>
@callback(error)
done()
it "should list all containers", ->
@listContainers
.calledWith(all: true)
.should.equal true
it "should destroy old containers", ->
@DockerRunner.destroyContainer
.callCount
.should.equal 1
@DockerRunner.destroyContainer
.calledWith("/project-old-container-name", "old-container-id")
.should.equal true
it "should not destroy new containers", ->
@DockerRunner.destroyContainer
.calledWith("/project-new-container-name", "new-container-id")
.should.equal false
it "should not destroy non-project containers", ->
@DockerRunner.destroyContainer
.calledWith("/totally-not-a-project-container", "some-random-id")
.should.equal false
it "should callback the callback", ->
@callback.called.should.equal true
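# Sketch of the age filter these tests describe: list all containers, then
# destroy only those whose name marks them as a project container and whose
# Created time (seconds) is older than MAX_CONTAINER_AGE (milliseconds).
# `docker` stands in for a dockerode instance and the /^\/project-/ pattern is
# inferred from the fixtures above; both are assumptions.
DockerRunner.destroyOldContainers = (callback = (error) ->) ->
  docker.listContainers all: true, (error, containers = []) ->
    return callback(error) if error?
    candidates = (c for c in containers when /^\/project-/.test(c.Name) and
      (Date.now() - c.Created * 1000) > DockerRunner.MAX_CONTAINER_AGE)
    destroyNext = ->
      return callback() if candidates.length == 0
      container = candidates.shift()
      DockerRunner.destroyContainer container.Name, container.Id, false, (error) ->
        return callback(error) if error?
        destroyNext()
    destroyNext()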
describe '_destroyContainer', ->
beforeEach ->
@containerId = 'some_id'
@fakeContainer =
remove: sinon.stub().callsArgWith(1, null)
@Docker::getContainer = sinon.stub().returns(@fakeContainer)
it 'should get the container', (done) ->
@DockerRunner._destroyContainer @containerId, false, (err) =>
@Docker::getContainer.callCount.should.equal 1
@Docker::getContainer.calledWith(@containerId).should.equal true
done()
it 'should try to force-destroy the container when shouldForce=true', (done) ->
@DockerRunner._destroyContainer @containerId, true, (err) =>
@fakeContainer.remove.callCount.should.equal 1
@fakeContainer.remove.calledWith({force: true}).should.equal true
done()
it 'should not try to force-destroy the container when shouldForce=false', (done) ->
@DockerRunner._destroyContainer @containerId, false, (err) =>
@fakeContainer.remove.callCount.should.equal 1
@fakeContainer.remove.calledWith({force: false}).should.equal true
done()
it 'should not produce an error', (done) ->
@DockerRunner._destroyContainer @containerId, false, (err) =>
expect(err).to.equal null
done()
describe 'when the container is already gone', ->
beforeEach ->
@fakeError = new Error('woops')
@fakeError.statusCode = 404
@fakeContainer =
remove: sinon.stub().callsArgWith(1, @fakeError)
@Docker::getContainer = sinon.stub().returns(@fakeContainer)
it 'should not produce an error', (done) ->
@DockerRunner._destroyContainer @containerId, false, (err) =>
expect(err).to.equal null
done()
describe 'when container.destroy produces an error', ->
beforeEach ->
@fakeError = new Error('woops')
@fakeError.statusCode = 500
@fakeContainer =
remove: sinon.stub().callsArgWith(1, @fakeError)
@Docker::getContainer = sinon.stub().returns(@fakeContainer)
it 'should produce an error', (done) ->
@DockerRunner._destroyContainer @containerId, false, (err) =>
expect(err).to.not.equal null
expect(err).to.equal @fakeError
done()
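# Sketch of the remove logic above: delete the container, optionally with
# force, and treat a 404 (already gone) as success. `docker` stands in for a
# dockerode instance; an assumption, not the module's real wiring.
DockerRunner._destroyContainer = (containerId, shouldForce, callback = (error) ->) ->
  container = docker.getContainer(containerId)
  container.remove {force: shouldForce == true}, (error) ->
    error = null if error? and error.statusCode == 404  # already gone: fine
    callback(error)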
describe 'kill', ->
beforeEach ->
@containerId = 'some_id'
@fakeContainer =
kill: sinon.stub().callsArgWith(0, null)
@Docker::getContainer = sinon.stub().returns(@fakeContainer)
it 'should get the container', (done) ->
@DockerRunner.kill @containerId, (err) =>
@Docker::getContainer.callCount.should.equal 1
@Docker::getContainer.calledWith(@containerId).should.equal true
done()
it 'should try to force-destroy the container', (done) ->
@DockerRunner.kill @containerId, (err) =>
@fakeContainer.kill.callCount.should.equal 1
done()
it 'should not produce an error', (done) ->
@DockerRunner.kill @containerId, (err) =>
expect(err).to.equal undefined
done()
describe 'when the container is not actually running', ->
beforeEach ->
@fakeError = new Error('woops')
@fakeError.statusCode = 500
@fakeError.message = 'Cannot kill container <whatever> is not running'
@fakeContainer =
kill: sinon.stub().callsArgWith(0, @fakeError)
@Docker::getContainer = sinon.stub().returns(@fakeContainer)
it 'should not produce an error', (done) ->
@DockerRunner.kill @containerId, (err) =>
expect(err).to.equal undefined
done()
describe 'when container.kill produces a legitimate error', ->
beforeEach ->
@fakeError = new Error('woops')
@fakeError.statusCode = 500
@fakeError.message = 'Totally legitimate reason to throw an error'
@fakeContainer =
kill: sinon.stub().callsArgWith(0, @fakeError)
@Docker::getContainer = sinon.stub().returns(@fakeContainer)
it 'should produce an error', (done) ->
@DockerRunner.kill @containerId, (err) =>
expect(err).to.not.equal undefined
expect(err).to.equal @fakeError
done()
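# Sketch of the error filtering in kill: a "not running" failure is swallowed,
# anything else is passed through. `docker` again stands in for a dockerode
# instance; the message pattern comes from the fixture above.
DockerRunner.kill = (containerId, callback = (error) ->) ->
  container = docker.getContainer(containerId)
  container.kill (error) ->
    error = undefined if error? and /Cannot kill container .+ is not running/.test(error.message)
    callback(error)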


@@ -0,0 +1,61 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/DraftModeManager'
describe 'DraftModeManager', ->
beforeEach ->
@DraftModeManager = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"logger-sharelatex": @logger = {log: () ->}
describe "_injectDraftOption", ->
it "should add draft option into documentclass with existing options", ->
@DraftModeManager
._injectDraftOption('''
\\documentclass[a4paper,foo=bar]{article}
''')
.should.equal('''
\\documentclass[draft,a4paper,foo=bar]{article}
''')
it "should add draft option into documentclass with no options", ->
@DraftModeManager
._injectDraftOption('''
\\documentclass{article}
''')
.should.equal('''
\\documentclass[draft]{article}
''')
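# One regex pair that satisfies both cases above; an assumption about the
# implementation, shown only to make the expected rewrites concrete.
DraftModeManager._injectDraftOption = (content) ->
  # \documentclass[opts] -> \documentclass[draft,opts]
  content = content.replace(/\\documentclass\[/g, "\\documentclass[draft,")
  # \documentclass{...} -> \documentclass[draft]{...}
  content.replace(/\\documentclass\{/g, "\\documentclass[draft]{")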
describe "injectDraftMode", ->
beforeEach ->
@filename = "/mock/filename.tex"
@callback = sinon.stub()
content = '''
\\documentclass{article}
\\begin{document}
Hello world
\\end{document}
'''
@fs.readFile = sinon.stub().callsArgWith(2, null, content)
@fs.writeFile = sinon.stub().callsArg(2)
@DraftModeManager.injectDraftMode @filename, @callback
it "should read the file", ->
@fs.readFile
.calledWith(@filename, "utf8")
.should.equal true
it "should write the modified file", ->
@fs.writeFile
.calledWith(@filename, """
\\documentclass[draft]{article}
\\begin{document}
Hello world
\\end{document}
""")
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
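# Read-modify-write sketch matching the stubs above: read the file as utf8,
# inject the draft option, and write the result back to the same path.
fs = require "fs"
DraftModeManager.injectDraftMode = (filename, callback = (error) ->) ->
  fs.readFile filename, "utf8", (error, content) ->
    return callback(error) if error?
    fs.writeFile filename, DraftModeManager._injectDraftOption(content), callback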


@@ -19,12 +19,14 @@ describe "LatexRunner", ->
@directory = "/local/compile/directory"
@mainFile = "main-file.tex"
@compiler = "pdflatex"
@image = "example.com/image"
@callback = sinon.stub()
@project_id = "project-id-123"
@env = {'foo': '123'}
describe "runLatex", ->
beforeEach ->
@CommandRunner.run = sinon.stub().callsArg(4)
@CommandRunner.run = sinon.stub().callsArg(6)
describe "normally", ->
beforeEach ->
@@ -33,11 +35,13 @@ describe "LatexRunner", ->
mainFile: @mainFile
compiler: @compiler
timeout: @timeout = 42000
image: @image
environment: @env
@callback
it "should run the latex command", ->
@CommandRunner.run
.calledWith(@project_id, sinon.match.any, @directory, @timeout)
.calledWith(@project_id, sinon.match.any, @directory, @image, @timeout, @env)
.should.equal true
describe "with an .Rtex main file", ->
@@ -46,6 +50,7 @@ describe "LatexRunner", ->
directory: @directory
mainFile: "main-file.Rtex"
compiler: @compiler
image: @image
timeout: @timeout = 42000
@callback
@@ -54,3 +59,21 @@ describe "LatexRunner", ->
mainFile = command.slice(-1)[0]
mainFile.should.equal "$COMPILE_DIR/main-file.tex"
describe "with a flags option", ->
beforeEach ->
@LatexRunner.runLatex @project_id,
directory: @directory
mainFile: @mainFile
compiler: @compiler
image: @image
timeout: @timeout = 42000
flags: ["-file-line-error", "-halt-on-error"]
@callback
it "should include the flags in the command", ->
command = @CommandRunner.run.args[0][1]
flags = command.filter (arg) ->
(arg == "-file-line-error") || (arg == "-halt-on-error")
flags.length.should.equal 2
flags[0].should.equal "-file-line-error"
flags[1].should.equal "-halt-on-error"
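# Sketch of the flag splicing asserted above: user flags sit after the base
# latexmk arguments and before the main file, which stays last. The base
# argument list and the _buildCommand helper name are assumptions.
LatexRunner._buildCommand = (mainFile, flags = []) ->
  base = ["latexmk", "-cd", "-f", "-interaction=batchmode"]
  base.concat(flags).concat(["$COMPILE_DIR/#{mainFile}"])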


@@ -0,0 +1,57 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/LockManager'
Path = require "path"
Errors = require "../../../app/js/Errors"
describe "DockerLockManager", ->
beforeEach ->
@LockManager = SandboxedModule.require modulePath, requires:
"settings-sharelatex": {}
"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub(), err:-> }
"fs":
lstat:sinon.stub().callsArgWith(1)
readdir: sinon.stub().callsArgWith(1)
"lockfile": @Lockfile = {}
@lockFile = "/local/compile/directory/.project-lock"
describe "runWithLock", ->
beforeEach ->
@runner = sinon.stub().callsArgWith(0, null, "foo", "bar")
@callback = sinon.stub()
describe "normally", ->
beforeEach ->
@Lockfile.lock = sinon.stub().callsArgWith(2, null)
@Lockfile.unlock = sinon.stub().callsArgWith(1, null)
@LockManager.runWithLock @lockFile, @runner, @callback
it "should run the compile", ->
@runner
.calledWith()
.should.equal true
it "should call the callback with the response from the compile", ->
@callback
.calledWithExactly(null, "foo", "bar")
.should.equal true
describe "when the project is locked", ->
beforeEach ->
@error = new Error()
@error.code = "EEXIST"
@Lockfile.lock = sinon.stub().callsArgWith(2,@error)
@Lockfile.unlock = sinon.stub().callsArgWith(1, null)
@LockManager.runWithLock @lockFile, @runner, @callback
it "should not run the compile", ->
@runner
.called
.should.equal false
it "should return an error", ->
error = new Errors.AlreadyCompilingError()
@callback
.calledWithExactly(error)
.should.equal true
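# Sketch of runWithLock as described above: acquire the lockfile, map EEXIST
# to AlreadyCompilingError, and always release the lock before forwarding the
# runner's results. Uses the lockfile package API; the Errors path is the one
# the test requires.
Lockfile = require "lockfile"
Errors = require "../../../app/js/Errors"
LockManager.runWithLock = (lockFile, runner, callback = (error) ->) ->
  Lockfile.lock lockFile, {}, (error) ->
    if error? and error.code == "EEXIST"
      return callback(new Errors.AlreadyCompilingError())
    return callback(error) if error?
    runner (args...) ->
      Lockfile.unlock lockFile, ->
        callback(args...)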


@@ -0,0 +1,103 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/OutputFileOptimiser'
path = require "path"
expect = require("chai").expect
EventEmitter = require("events").EventEmitter
describe "OutputFileOptimiser", ->
beforeEach ->
@OutputFileOptimiser = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"path": @Path = {}
"child_process": spawn: @spawn = sinon.stub()
"logger-sharelatex": { log: sinon.stub(), warn: sinon.stub() }
"./Metrics" : {}
@directory = "/test/dir"
@callback = sinon.stub()
describe "optimiseFile", ->
beforeEach ->
@src = "./output.pdf"
@dst = "./output.pdf"
describe "when the file is not a pdf file", ->
beforeEach (done)->
@src = "./output.log"
@OutputFileOptimiser.checkIfPDFIsOptimised = sinon.stub().callsArgWith(1, null, false)
@OutputFileOptimiser.optimisePDF = sinon.stub().callsArgWith(2, null)
@OutputFileOptimiser.optimiseFile @src, @dst, done
it "should not check if the file is optimised", ->
@OutputFileOptimiser.checkIfPDFIsOptimised.calledWith(@src).should.equal false
it "should not optimise the file", ->
@OutputFileOptimiser.optimisePDF.calledWith(@src, @dst).should.equal false
describe "when the pdf file is not optimised", ->
beforeEach (done) ->
@OutputFileOptimiser.checkIfPDFIsOptimised = sinon.stub().callsArgWith(1, null, false)
@OutputFileOptimiser.optimisePDF = sinon.stub().callsArgWith(2, null)
@OutputFileOptimiser.optimiseFile @src, @dst, done
it "should check if the pdf is optimised", ->
@OutputFileOptimiser.checkIfPDFIsOptimised.calledWith(@src).should.equal true
it "should optimise the pdf", ->
@OutputFileOptimiser.optimisePDF.calledWith(@src, @dst).should.equal true
describe "when the pdf file is optimised", ->
beforeEach (done) ->
@OutputFileOptimiser.checkIfPDFIsOptimised = sinon.stub().callsArgWith(1, null, true)
@OutputFileOptimiser.optimisePDF = sinon.stub().callsArgWith(2, null)
@OutputFileOptimiser.optimiseFile @src, @dst, done
it "should check if the pdf is optimised", ->
@OutputFileOptimiser.checkIfPDFIsOptimised.calledWith(@src).should.equal true
it "should not optimise the pdf", ->
@OutputFileOptimiser.optimisePDF.calledWith(@src, @dst).should.equal false
describe "checkIfPDFISOptimised", ->
beforeEach () ->
@callback = sinon.stub()
@fd = 1234
@fs.open = sinon.stub().yields(null, @fd)
@fs.read = sinon.stub().withArgs(@fd).yields(null, 100, new Buffer("hello /Linearized 1"))
@fs.close = sinon.stub().withArgs(@fd).yields(null)
@OutputFileOptimiser.checkIfPDFIsOptimised @src, @callback
describe "for a linearised file", ->
beforeEach () ->
@fs.read = sinon.stub().withArgs(@fd).yields(null, 100, new Buffer("hello /Linearized 1"))
@OutputFileOptimiser.checkIfPDFIsOptimised @src, @callback
it "should open the file", ->
@fs.open.calledWith(@src, "r").should.equal true
it "should read the header", ->
@fs.read.calledWith(@fd).should.equal true
it "should close the file", ->
@fs.close.calledWith(@fd).should.equal true
it "should call the callback with a true result", ->
@callback.calledWith(null, true).should.equal true
describe "for an unlinearised file", ->
beforeEach () ->
@fs.read = sinon.stub().withArgs(@fd).yields(null, 100, new Buffer("hello not linearized 1"))
@OutputFileOptimiser.checkIfPDFIsOptimised @src, @callback
it "should open the file", ->
@fs.open.calledWith(@src, "r").should.equal true
it "should read the header", ->
@fs.read.calledWith(@fd).should.equal true
it "should close the file", ->
@fs.close.calledWith(@fd).should.equal true
it "should call the callback with a false result", ->
@callback.calledWith(null, false).should.equal true
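# Sketch of the linearisation probe: read the first bytes of the PDF and look
# for the "/Linearized 1" marker found near the start of web-optimised files.
# The 100-byte window matches the stubbed read above; otherwise an assumption.
fs = require "fs"
OutputFileOptimiser.checkIfPDFIsOptimised = (file, callback = (error, isOptimised) ->) ->
  SIZE = 100
  buffer = Buffer.alloc(SIZE)
  fs.open file, "r", (error, fd) ->
    return callback(error) if error?
    fs.read fd, buffer, 0, SIZE, 0, (error, bytesRead, buffer) ->
      fs.close fd, ->
        return callback(error) if error?
        callback(null, buffer.toString("ascii").indexOf("/Linearized 1") >= 0)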


@@ -13,6 +13,7 @@ describe "ProjectPersistenceManager", ->
"./db": @db = {}
@callback = sinon.stub()
@project_id = "project-id-123"
@user_id = "1234"
describe "clearExpiredProjects", ->
beforeEach ->
@@ -21,12 +22,13 @@ describe "ProjectPersistenceManager", ->
"project-id-2"
]
@ProjectPersistenceManager._findExpiredProjectIds = sinon.stub().callsArgWith(0, null, @project_ids)
@ProjectPersistenceManager.clearProject = sinon.stub().callsArg(1)
@ProjectPersistenceManager.clearProjectFromCache = sinon.stub().callsArg(1)
@CompileManager.clearExpiredProjects = sinon.stub().callsArg(1)
@ProjectPersistenceManager.clearExpiredProjects @callback
it "should clear each expired project", ->
for project_id in @project_ids
@ProjectPersistenceManager.clearProject
@ProjectPersistenceManager.clearProjectFromCache
.calledWith(project_id)
.should.equal true
@@ -37,8 +39,8 @@ describe "ProjectPersistenceManager", ->
beforeEach ->
@ProjectPersistenceManager._clearProjectFromDatabase = sinon.stub().callsArg(1)
@UrlCache.clearProject = sinon.stub().callsArg(1)
@CompileManager.clearProject = sinon.stub().callsArg(1)
@ProjectPersistenceManager.clearProject @project_id, @callback
@CompileManager.clearProject = sinon.stub().callsArg(2)
@ProjectPersistenceManager.clearProject @project_id, @user_id, @callback
it "should clear the project from the database", ->
@ProjectPersistenceManager._clearProjectFromDatabase
@@ -52,7 +54,7 @@ describe "ProjectPersistenceManager", ->
it "should clear the project compile folder", ->
@CompileManager.clearProject
.calledWith(@project_id)
.calledWith(@project_id, @user_id)
.should.equal true
it "should call the callback", ->


@@ -1,6 +1,7 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
expect = require('chai').expect
modulePath = require('path').join __dirname, '../../../app/js/RequestParser'
tk = require("timekeeper")
@@ -16,10 +17,12 @@ describe "RequestParser", ->
compile:
token: "token-123"
options:
imageName: "basicImageName/here:2017-1"
compiler: "pdflatex"
timeout: 42
resources: []
@RequestParser = SandboxedModule.require modulePath
@RequestParser = SandboxedModule.require modulePath, requires:
"settings-sharelatex": @settings = {}
afterEach ->
tk.reset()
@@ -57,6 +60,28 @@ describe "RequestParser", ->
it "should set the compiler to pdflatex by default", ->
@data.compiler.should.equal "pdflatex"
describe "with imageName set", ->
beforeEach ->
@RequestParser.parse @validRequest, (error, @data) =>
it "should set the imageName", ->
@data.imageName.should.equal "basicImageName/here:2017-1"
describe "with flags set", ->
beforeEach ->
@validRequest.compile.options.flags = ["-file-line-error"]
@RequestParser.parse @validRequest, (error, @data) =>
it "should set the flags attribute", ->
expect(@data.flags).to.deep.equal ["-file-line-error"]
describe "with flags not specified", ->
beforeEach ->
@RequestParser.parse @validRequest, (error, @data) =>
it "it should have an empty flags list", ->
expect(@data.flags).to.deep.equal []
describe "without a timeout specified", ->
beforeEach ->
delete @validRequest.compile.options.timeout
@@ -206,11 +231,49 @@ describe "RequestParser", ->
describe "with a root resource path that needs escaping", ->
beforeEach ->
@validRequest.compile.rootResourcePath = "`rm -rf foo`.tex"
@badPath = "`rm -rf foo`.tex"
@goodPath = "rm -rf foo.tex"
@validRequest.compile.rootResourcePath = @badPath
@validRequest.compile.resources.push {
path: @badPath
date: "12:00 01/02/03"
content: "Hello world"
}
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return the escaped resource", ->
@data.rootResourcePath.should.equal "rm -rf foo.tex"
@data.rootResourcePath.should.equal @goodPath
it "should also escape the resource path", ->
@data.resources[0].path.should.equal @goodPath
describe "with a root resource path that has a relative path", ->
beforeEach ->
@validRequest.compile.rootResourcePath = "foo/../../bar.tex"
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return an error", ->
@callback.calledWith("relative path in root resource")
.should.equal true
describe "with a root resource path that has unescaped + relative path", ->
beforeEach ->
@validRequest.compile.rootResourcePath = "foo/#../bar.tex"
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return an error", ->
@callback.calledWith("relative path in root resource")
.should.equal true
describe "with an unknown syncType", ->
beforeEach ->
@validRequest.compile.options.syncType = "unexpected"
@RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1]
it "should return an error", ->
@callback.calledWith("syncType attribute should be one of: full, incremental")
.should.equal true
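# Sketch of the two option rules pinned down above: flags default to an empty
# array, and an unknown syncType is rejected with the exact message the test
# expects. _parseOptions is a hypothetical helper, not the parser's real shape.
RequestParser._parseOptions = (options, callback = (error, data) ->) ->
  data = {}
  data.flags = options.flags ? []
  if options.syncType? and options.syncType not in ["full", "incremental"]
    return callback("syncType attribute should be one of: full, incremental")
  callback(null, data)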


@@ -0,0 +1,109 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
should = require('chai').should()
modulePath = require('path').join __dirname, '../../../app/js/ResourceStateManager'
Path = require "path"
Errors = require "../../../app/js/Errors"
describe "ResourceStateManager", ->
beforeEach ->
@ResourceStateManager = SandboxedModule.require modulePath, requires:
"fs": @fs = {}
"logger-sharelatex": {log: sinon.stub(), err: sinon.stub()}
"./SafeReader": @SafeReader = {}
@basePath = "/path/to/write/files/to"
@resources = [
{path: "resource-1-mock"}
{path: "resource-2-mock"}
{path: "resource-3-mock"}
]
@state = "1234567890"
@resourceFileName = "#{@basePath}/.project-sync-state"
@resourceFileContents = "#{@resources[0].path}\n#{@resources[1].path}\n#{@resources[2].path}\nstateHash:#{@state}"
@callback = sinon.stub()
describe "saveProjectState", ->
beforeEach ->
@fs.writeFile = sinon.stub().callsArg(2)
describe "when the state is specified", ->
beforeEach ->
@ResourceStateManager.saveProjectState(@state, @resources, @basePath, @callback)
it "should write the resource list to disk", ->
@fs.writeFile
.calledWith(@resourceFileName, @resourceFileContents)
.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
describe "when the state is undefined", ->
beforeEach ->
@state = undefined
@fs.unlink = sinon.stub().callsArg(1)
@ResourceStateManager.saveProjectState(@state, @resources, @basePath, @callback)
it "should unlink the resource file", ->
@fs.unlink
.calledWith(@resourceFileName)
.should.equal true
it "should not write the resource list to disk", ->
@fs.writeFile.called.should.equal false
it "should call the callback", ->
@callback.called.should.equal true
describe "checkProjectStateMatches", ->
describe "when the state matches", ->
beforeEach ->
@SafeReader.readFile = sinon.stub().callsArgWith(3, null, @resourceFileContents)
@ResourceStateManager.checkProjectStateMatches(@state, @basePath, @callback)
it "should read the resource file", ->
@SafeReader.readFile
.calledWith(@resourceFileName)
.should.equal true
it "should call the callback with the results", ->
@callback.calledWithMatch(null, @resources).should.equal true
describe "when the state does not match", ->
beforeEach ->
@SafeReader.readFile = sinon.stub().callsArgWith(3, null, @resourceFileContents)
@ResourceStateManager.checkProjectStateMatches("not-the-original-state", @basePath, @callback)
it "should call the callback with an error", ->
error = new Errors.FilesOutOfSyncError("invalid state for incremental update")
@callback.calledWith(error).should.equal true
describe "checkResourceFiles", ->
describe "when all the files are present", ->
beforeEach ->
@allFiles = [ @resources[0].path, @resources[1].path, @resources[2].path]
@ResourceStateManager.checkResourceFiles(@resources, @allFiles, @basePath, @callback)
it "should call the callback", ->
@callback.calledWithExactly().should.equal true
describe "when there is a missing file", ->
beforeEach ->
@allFiles = [ @resources[0].path, @resources[1].path]
@fs.stat = sinon.stub().callsArgWith(1, new Error())
@ResourceStateManager.checkResourceFiles(@resources, @allFiles, @basePath, @callback)
it "should call the callback with an error", ->
error = new Errors.FilesOutOfSyncError("resource files missing in incremental update")
@callback.calledWith(error).should.equal true
describe "when a resource contains a relative path", ->
beforeEach ->
@resources[0].path = "../foo/bar.tex"
@allFiles = [ @resources[0].path, @resources[1].path, @resources[2].path]
@ResourceStateManager.checkResourceFiles(@resources, @allFiles, @basePath, @callback)
it "should call the callback with an error", ->
@callback.calledWith(new Error("relative path in resource file list")).should.equal true
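# Sketch of the .project-sync-state layout implied by @resourceFileContents:
# one resource path per line, then a final "stateHash:<state>" line. When no
# state is given the file is removed instead. Error handling is elided.
fs = require "fs"
Path = require "path"
ResourceStateManager.saveProjectState = (state, resources, basePath, callback = (error) ->) ->
  stateFile = Path.join(basePath, ".project-sync-state")
  if not state?
    fs.unlink stateFile, -> callback()
  else
    lines = (resource.path for resource in resources)
    lines.push "stateHash:#{state}"
    fs.writeFile stateFile, lines.join("\n"), callback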

Some files were not shown because too many files have changed in this diff.