235 Commits

Author SHA1 Message Date
Christopher Hoskin
a47237ccb7 Explicitly start build immediately 2019-10-25 16:54:36 +01:00
Christopher Hoskin
ab82140128 Try some parallelism 2019-10-25 16:42:33 +01:00
Christopher Hoskin
4d81b2ca53 Another go at printing out times 2019-10-25 11:55:57 +01:00
Christopher Hoskin
c1eca448c5 Need bash for pipefail 2019-10-25 11:21:24 +01:00
Christopher Hoskin
d04df4ed75 Try to get more times 2019-10-25 10:42:21 +01:00
Christopher Hoskin
d9f487efc4 Remove bin/install_texlive_gce.sh 2019-10-25 10:10:14 +01:00
Christopher Hoskin
02b5cc8efd Revert "Remove TEXLIVE_IMAGE env var - expect failure"
This reverts commit 3ab407b91a.
2019-10-24 17:08:43 +01:00
Christopher Hoskin
f3c6756294 Try to narrow down where the delay is 2019-10-24 14:52:28 +01:00
Christopher Hoskin
3ab407b91a Remove TEXLIVE_IMAGE env var - expect failure 2019-10-24 12:13:29 +01:00
Christopher Hoskin
0b40c8f79d Try a different texlive image 2019-10-24 10:58:04 +01:00
Christopher Hoskin
882732d6a5 Try build on bigger machine 2019-10-23 21:42:09 +01:00
Christopher Hoskin
d934f96370 Increase timeout 2019-10-23 19:03:15 +01:00
Christopher Hoskin
0f84c47bbe Try again with gcr texlive image 2019-10-23 17:08:13 +01:00
Christopher Hoskin
fdd87d77cc Try env pass-through 2019-10-23 16:35:59 +01:00
Christopher Hoskin
7ebc9b43a1 Bump buildscript to 1.1.22 2019-10-23 16:28:23 +01:00
Christopher Hoskin
4c4dd64ca6 Try again 2019-10-23 14:31:15 +01:00
Christopher Hoskin
693b9e6193 Use gcr.io texlive image 2019-10-23 14:26:37 +01:00
Christopher Hoskin
65d416ee10 Add quay.io/sharelatex/texlive-full:2017.1 as a custom builder 2019-10-23 12:03:11 +01:00
Christopher Hoskin
8f70dbd67b Start trying to figure out CLSI cloudbuild 2019-10-23 12:00:15 +01:00
Miguel Serrano
a62ff6e248 Merge pull request #131 from yuantailing/fix-compiler-manager
Fix synctex for LocalCommandRunner
2019-10-08 12:48:13 +02:00
Tailing Yuan
481a49a587 fix CompileManager and LocalCommandRunner 2019-10-04 23:02:03 +08:00
Shane Kilkelly
2675fa033e Merge pull request #128 from overleaf/sk-dep-upgrades-2
Update logger, metrics
2019-07-11 12:51:16 +01:00
Shane Kilkelly
dc6af8799f update logger and metrics 2019-06-18 16:29:20 +01:00
Shane Kilkelly
61bed0da2b Merge pull request #126 from overleaf/sk-increase-hard-timeout
Increase the hard-timeout to 10 minutes.
2019-06-10 09:44:48 +01:00
Shane Kilkelly
4f6ef61626 Increase the hard-timeout to 10 minutes.
In practice most projects will still be limited to five minutes,
but this allows us to bump up the limit for some projects,
especially legacy v1 projects that have been imported to v2
2019-06-06 16:39:16 +01:00
Brian Gough
ada07ad2c3 Merge pull request #120 from das7pad/hotfix/docker-group
[docker] add support for a different docker group id on the docker host
2019-05-16 14:04:27 +01:00
Brian Gough
bc530c70e2 Merge pull request #119 from overleaf/bg-increase-acceptance-test-timeout
increase timeout for long-running acceptance tests
2019-05-16 09:17:26 +01:00
Michael Mazour
db00288bb9 Merge pull request #125 from overleaf/mm-flags-in-request
Add flags option to request JSON
2019-05-15 14:06:47 +01:00
Michael Mazour
663ec88718 Add flags option to request JSON
Adds a `flags` parameter to the request JSON, appearing under the `compile.options` key (alongside such stalwarts as `compiler`, `timeout`, etc.).

This is primarily to support `-file-line-error` as an option, but could have other uses as well.

`flags` should be an array of strings, or absent. If supplied, the listed arguments are added to the base latexmk command.
2019-05-14 16:24:34 +01:00
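A minimal sketch of a compile request using the new option. The `flags` array is what this commit adds; the surrounding fields follow the request format documented in the CLSI README, and all values here are illustrative:

    {
      "compile": {
        "options": {
          "compiler": "pdflatex",
          "timeout": 60,
          "flags": ["-file-line-error"]
        },
        "rootResourcePath": "main.tex",
        "resources": []
      }
    }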
Tim Alby
03047f45af update Git URL in Jenkinsfile 2019-05-07 18:31:54 +02:00
Timothée Alby
11cf8a98fa Update README.md 2019-05-07 16:41:17 +01:00
Christopher Hoskin
d2c2629ef5 Bump buildscripts from 1.1.11 to 1.1.20 2019-05-03 10:29:38 +01:00
Jakob Ackermann
adfeffd254 [docker] add support for a different docker group id on the docker host
Signed-off-by: Jakob Ackermann <das7pad@outlook.com>
2019-04-23 01:53:40 +02:00
Brian Gough
bd42fe5776 increase timeout for long-running acceptance tests 2019-04-01 09:42:54 +01:00
Christopher Hoskin
3200161308 Merge pull request #116 from sharelatex/csh-formalise-node-10.15
Formalise node 10.15 update
2019-03-28 11:59:08 +00:00
Christopher Hoskin
9cb14660d4 Formalise node 10.15 update 2019-03-26 11:50:59 +00:00
Henry Oswald
31153c479c change console.log for logger.log 2019-03-22 20:42:26 +00:00
Christopher Hoskin
f422bb8011 Merge pull request #113 from sharelatex/ho-osx-epoll
add epoll_pwait to seccomp profile
2019-03-04 14:57:01 +00:00
Christopher Hoskin
25c4c349d7 Merge pull request #115 from sharelatex/csh-issue-204-clsi-log-stackdriver
Bump logger to v1.6.0
2019-03-04 14:56:17 +00:00
Christopher Hoskin
e2377e1c1c Bump logger to v1.6.0 2019-03-04 12:05:28 +00:00
Brian Gough
1899d27732 increase acceptance test timeout to 1 minute 2019-02-22 13:58:12 +00:00
Brian Gough
9bf3795ceb Merge pull request #114 from sharelatex/bg-avoid-text-html-content-type-in-responses
use explicit json content-type to avoid security issues with text/html
2019-02-22 11:35:24 +00:00
Brian Gough
d20856f799 use explicit json content-type to avoid security issues with text/html 2019-02-12 16:54:59 +00:00
Henry Oswald
12fee9e4df add epoll_pwait to seccomp profile
Last year golang changed from epoll_wait to epoll_pwait (https://github.com/golang/go/issues/23750).

This causes golang panic errors on Mac when running seccomp secure compiles with Docker 18.09.1. It may start to become a problem on Linux, where we are running 17.03.2-ce in production.
2019-01-24 12:30:37 +00:00
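A hedged sketch of the corresponding profile change, using Docker's seccomp JSON structure (the exact profile shipped with the CLSI may differ, and older profile versions use a singular "name" field per rule):

    {
      "syscalls": [
        {
          "names": ["epoll_wait", "epoll_pwait"],
          "action": "SCMP_ACT_ALLOW"
        }
      ]
    }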
Christopher Hoskin
ddaa944aa3 Merge pull request #112 from sharelatex/csh-issue-1309-node-10.15.0
Upgrade to Node 10 - CLSI
2019-01-17 09:50:19 +00:00
Christopher Hoskin
a194d7ad05 Fix broken spacing 2019-01-16 15:12:23 +00:00
Christopher Hoskin
4c8b619ee8 Switch to node 10 2019-01-16 15:11:49 +00:00
Christopher Hoskin
4bd67d5e7e Merge pull request #111 from sharelatex/csh-issue-1338-bulk-upgrade
Services bulk upgrade - CLSI
2019-01-15 12:28:35 +00:00
Christopher Hoskin
c269c308ef Correctly pass command with arguments to runuser 2019-01-15 11:29:04 +00:00
Christopher Hoskin
e12ffdd535 Pass arguments to node, not to runuser 2019-01-15 11:12:21 +00:00
Christopher Hoskin
82afad7afc Add **/*.map to .gitignore 2019-01-11 12:11:36 +00:00
Christopher Hoskin
2fceac6ac8 Remove grunt 2019-01-11 12:06:45 +00:00
Christopher Hoskin
d4e9aca9e2 Bump buildscript to 1.1.11 2019-01-11 11:52:10 +00:00
Christopher Hoskin
5d2eb129e8 Init metrics at top of app.coffee 2019-01-11 10:19:47 +00:00
Christopher Hoskin
b52a8b2aa2 Bump logger to v1.5.9 and settings to v1.1.0 2019-01-11 10:18:37 +00:00
Henry Oswald
6fbdcd76d0 Merge pull request #110 from sharelatex/ho-increase-compile-size
pull clsi compile size limit into setting and bump to 7mb
2019-01-08 13:30:00 +00:00
Henry Oswald
541dac11cb pull clsi compile size limit into setting and bump to 7mb 2019-01-08 12:56:16 +00:00
Christopher Hoskin
ee7947f54d Merge pull request #107 from sharelatex/csh-issue-1309-node-6.15
Bump node to 6.15
2018-12-18 11:16:25 +00:00
Christopher Hoskin
984474ee11 Add npm-shrinkwrap.json 2018-12-18 11:03:06 +00:00
Christopher Hoskin
be855805c9 package-lock not supported until npm 5 2018-12-17 15:31:45 +00:00
Christopher Hoskin
2d023a3b03 Bump node to 6.15.1 2018-12-17 15:29:56 +00:00
Christopher Hoskin
1894e8ad5d Merge pull request #106 from sharelatex/csh-prom-metrics
Use Prometheus metrics
2018-12-14 10:21:40 +00:00
Christopher Hoskin
9507f0f80f Revert "Bump buildscript to 1.1.10"
This reverts commit 38874f9169.
2018-12-13 17:37:16 +00:00
Christopher Hoskin
19078fe866 Revert "Initialise metrics at beginning of app"
This reverts commit 855f26c520.
2018-12-13 17:33:45 +00:00
Christopher Hoskin
38874f9169 Bump buildscript to 1.1.10 2018-12-13 14:45:40 +00:00
Christopher Hoskin
855f26c520 Initialise metrics at beginning of app 2018-12-13 14:24:44 +00:00
Christopher Hoskin
8401bbdc26 Bump metrics-sharelatex to v2.0.12 2018-12-13 14:21:32 +00:00
Christopher Hoskin
71181243b3 Bump metrics-sharelatex.git to v2.0.11 2018-12-13 14:15:19 +00:00
Christopher Hoskin
0b4ae6ef8d Use metrics which labels host in timing 2018-12-11 12:11:53 +00:00
Christopher Hoskin
747c73fdad Merge pull request #105 from sharelatex/csh-204
Bump metrics to 2.0.4
2018-12-03 15:12:16 +00:00
Christopher Hoskin
1c1610a0bc Bump metrics to 2.0.4 2018-12-03 15:10:39 +00:00
Christopher Hoskin
434e819d23 Merge pull request #104 from sharelatex/csh-stackdriver
Add Prometheus Metrics to CLSIs
2018-12-03 11:45:02 +00:00
Christopher Hoskin
f92e626647 Inject routes after app defined 2018-11-29 15:49:12 +00:00
Christopher Hoskin
6159aff001 Inject metrics 2018-11-29 14:30:00 +00:00
Christopher Hoskin
49d5ad711a Bump metrics to v2.0.3 - specify tag correctly this time 2018-11-29 10:24:25 +00:00
Christopher Hoskin
bcdac34a0b Use v1.9.0 of metrics to get Prometheus support 2018-11-29 10:10:48 +00:00
Christopher Hoskin
25cb54d1d7 Merge branch 'master' into csh-stackdriver 2018-11-29 10:06:48 +00:00
Henry Oswald
75e77a3991 Merge pull request #103 from sharelatex/ho-mute-sentry-errors
have failed compiles warn rather than be an error
2018-11-28 22:35:51 +09:00
Henry Oswald
49f3b7d54f have failed compiles warn rather than be an error 2018-11-23 15:10:35 +00:00
Christopher Hoskin
f1ab938bab Merge pull request #102 from sharelatex/csh-expand-abbr
Expand CLSI to Common LaTeX Service Interface on first use
2018-11-22 09:52:30 +00:00
Christopher Hoskin
a18d49562c Expand CLSI to Common LaTeX Service Interface on first use 2018-11-22 09:13:23 +00:00
Christopher Hoskin
d3039a52f3 First attempt to use my stackdriver branch 2018-11-07 08:29:34 +00:00
Christopher Hoskin
7e07b8b4a7 Merge pull request #101 from sharelatex/csh-documentation
Add some notes on the CLSIs
2018-10-23 14:43:06 +01:00
Christopher Hoskin
473efdae70 Merge branch 'csh-documentation' of github.com:sharelatex/clsi-sharelatex into csh-documentation 2018-10-22 17:55:47 +01:00
Christopher Hoskin
3aa160b0e7 Make README more generic 2018-10-22 17:52:38 +01:00
Christopher Hoskin
114e4f7043 Fix indenting 2018-10-22 16:03:50 +01:00
Christopher Hoskin
cd0a71caba Add some notes on the CLSIs 2018-10-22 16:01:17 +01:00
Brian Gough
96d6fb3404 Merge pull request #100 from sharelatex/bg-create-main-file-for-pstool
use TikzManager to create main file for pstool package
2018-10-15 11:05:23 +01:00
Brian Gough
1481b4fe50 fix exception when content undefined in TikzManager 2018-10-15 10:01:52 +01:00
Brian Gough
3aad472a83 improve log message 2018-10-12 10:49:54 +01:00
Brian Gough
49ddcee0c6 use TikzManager to create main file for pstool package 2018-10-10 16:13:20 +01:00
Brian Gough
6d1545a40e Merge pull request #99 from sharelatex/bg-cache-tikz-minted-and-markdown-outputs
extend caching for tikz, minted and markdown files
2018-10-08 09:22:20 +01:00
Brian Gough
9ce7bfa8ab extend caching for tikz, minted and markdown files 2018-10-04 16:56:48 +01:00
Henry Oswald
7c4c8a9e44 remove debugging get settings function 2018-09-14 10:26:40 +01:00
Brian Gough
90436933da Merge pull request #96 from sharelatex/bg-cache-eps-to-pdf-converted-files
cache pdf files generated by epstopdf
2018-09-11 13:31:26 +01:00
Henry Oswald
77abf19f6b Merge pull request #86 from sharelatex/ho-dockerise
Dockerised clsi
2018-09-11 12:36:11 +01:00
Henry Oswald
a781c7f600 change timeout test latex code 2018-09-11 11:34:25 +01:00
Henry Oswald
b07b7a84be fix unit tests 2018-09-11 10:21:37 +01:00
Henry Oswald
58b4de905c Merge branch 'master' into ho-dockerise 2018-09-11 10:02:24 +01:00
Henry Oswald
5f9fb85613 bump wordcount timeouts, taken from 82b996b145 2018-09-11 09:55:10 +01:00
Henry Oswald
d3bb863d0a improve synctex logging 2018-09-11 09:51:20 +01:00
Brian Gough
00ebc87230 cache pdf files generated by epstopdf 2018-09-11 09:44:22 +01:00
Henry Oswald
6299832a13 don't error on a bad synctex call 2018-08-23 11:32:50 +01:00
Henry Oswald
607bb74ffa reduce log level 2018-08-23 11:16:28 +01:00
Henry Oswald
b4107b7391 fse.ensureDir when running synctex and wordcount 2018-08-23 08:34:18 +01:00
Henry Oswald
5074442702 fix unit tests 2018-08-23 00:21:05 +01:00
Henry Oswald
05ddbd3a18 try changing bin to be owned by node 2018-08-23 00:10:06 +01:00
Henry Oswald
7b773474d9 improve error reporting 2018-08-23 00:00:43 +01:00
Henry Oswald
e4d28addf9 change sync to async for lockfile debugging 2018-08-22 22:17:02 +01:00
Henry Oswald
171ad0329d fix sql query checking last access time 2018-08-22 18:21:15 +01:00
Henry Oswald
834eeffda4 add time to seccomp 2018-08-21 18:56:53 +01:00
Henry Oswald
0f179a7c7c add log on exited error code 2018-08-21 12:02:12 +01:00
Henry Oswald
1990f20dc0 improve error reporting 2018-08-20 10:12:32 +01:00
Henry Oswald
407c7c235b Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-08-19 11:46:11 +01:00
Henry Oswald
988f177f79 added loads of debugging 2018-08-19 11:38:27 +01:00
Christopher Hoskin
c6f49f04a9 Merge pull request #95 from sharelatex/csh-sentry
read sentry dsn from env var into config
2018-08-15 11:49:34 +01:00
Christopher Hoskin
a26d7093b4 Merge branch 'ho-dockerise' into csh-sentry 2018-08-15 09:44:02 +01:00
Henry Oswald
eec0529ef7 put FILESTORE_PARALLEL_FILE_DOWNLOADS and
FILESTORE_PARALLEL_SQL_QUERY_LIMIT into env vars
2018-08-14 15:17:56 +01:00
Christopher Hoskin
382f30f810 Revert "Put a guard on sentry dsn"
This reverts commit 95e052d059.
2018-08-13 17:36:53 +01:00
Christopher Hoskin
95e052d059 Put a guard on sentry dsn 2018-08-13 12:27:13 +01:00
Christopher Hoskin
9f79229835 Read sentry dsn from env 2018-08-03 15:33:53 +01:00
Henry Oswald
95b2e8caae comment out erroring log for the moment 2018-08-01 14:32:17 +01:00
Henry Oswald
3890cdec37 null check host options 2018-08-01 14:10:22 +01:00
Henry Oswald
3e3468d9e9 reduce logging 2018-08-01 13:59:09 +01:00
Henry Oswald
9ef9a3b780 make Settings.parallelSqlQueryLimit a config setting 2018-07-31 14:38:24 +01:00
Henry Oswald
ee518c1755 fix expired projects command 2018-07-30 17:37:30 +01:00
Henry Oswald
3a9206f1e7 fix missing cb’s 2018-07-30 17:01:59 +01:00
Henry Oswald
d1ce49d6d7 add db queue file for global db query queues 2018-07-30 16:46:47 +01:00
Henry Oswald
627bed428e added a queue with 1 concurrency to db queries 2018-07-30 16:22:04 +01:00
Henry Oswald
92e1240635 added some debugging 2018-07-30 15:18:25 +01:00
Henry Oswald
94a52333f7 add sync=off and read_uncommitted=true to improve perf 2018-07-30 15:16:06 +01:00
Henry Oswald
c490479a1a remove some console.logs 2018-07-30 15:11:41 +01:00
Henry Oswald
f802717cb5 remove password from clsi for sql
sequelize fails when it is set to null
2018-07-30 14:04:33 +01:00
Henry Oswald
0eeee4284d bump retried and package versions 2018-07-30 11:25:28 +01:00
Henry Oswald
e1c23be845 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-07-26 16:52:26 +01:00
Henry Oswald
67d34fdaf0 add wal logging 2018-07-26 16:12:26 +01:00
Christopher Hoskin
465dc31e75 Push images to overleaf-ops 2018-07-18 11:32:41 +01:00
Henry Oswald
2b6032b249 only set wal for sqlite 2018-07-17 12:53:07 +01:00
Henry Oswald
3478c28fa3 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-07-17 12:52:18 +01:00
Henry Oswald
3e26efe06f add PRAGMA journal_mode=WAL; 2018-07-17 12:50:33 +01:00
Christopher Hoskin
fb00098fc0 Bump build script to 1.1.8, drop csh-gcdm-test and csh-staging repos 2018-07-17 12:10:08 +01:00
Brian Gough
33092baf90 Merge branch 'master' of github.com:sharelatex/clsi-sharelatex 2018-07-17 10:41:14 +01:00
Brian Gough
4830e9f785 allow prune to fail to prevent build from terminating 2018-07-17 10:41:10 +01:00
Brian Gough
368f9b1c5d Merge pull request #91 from sharelatex/bg-increase-wordcount-timeout
increase timeout on wordcount
2018-07-17 10:10:36 +01:00
Henry Oswald
bcb87620b5 change override to leave image name so it works for wl_texlive 2018-07-16 17:25:14 +01:00
Henry Oswald
dd015a05cb remove express header 2018-07-16 15:38:23 +01:00
Henry Oswald
8d846f64a9 move texliveImageNameOveride further down request so it works for
compile tests
2018-07-13 11:52:49 +01:00
Henry Oswald
3545852173 quick hack to overwrite image name further down stack 2018-07-13 11:46:37 +01:00
Henry Oswald
7fc9412141 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-07-13 10:42:27 +01:00
Henry Oswald
a960614eb4 added texliveImageNameOveride 2018-07-13 10:37:22 +01:00
Christopher Hoskin
38bd598eb4 Merge pull request #94 from sharelatex/csh-remote-statsd
Depend on metrics v1.8.1 for remote StatsD host
2018-07-12 12:43:40 +01:00
Christopher Hoskin
97716365af Depend on metrics v1.8.1 for remote StatsD host 2018-07-12 11:22:02 +01:00
Christopher Hoskin
c1277e9f22 Use our experimental metrics 2018-07-06 15:08:38 +01:00
Henry Oswald
a75cec7d52 added maint down endpoint 2018-07-05 15:07:07 +01:00
Henry Oswald
6464aefdb4 added filestoreDomainOveride 2018-07-03 16:41:34 +01:00
Henry Oswald
ec85957ae4 add load balance http endpoints to shut box down 2018-06-28 16:04:34 +01:00
Henry Oswald
4bfc02ef3b fix seccomp key 2018-06-26 15:38:30 +01:00
Henry Oswald
364c8097c8 add error catch to settings.defaults 2018-06-26 15:04:56 +01:00
Henry Oswald
911e1d58f7 put seccomp_profile_path into variable and try catch 2018-06-26 14:44:03 +01:00
Henry Oswald
dd93d37460 added seccomp 2018-06-26 12:43:47 +01:00
Brian Gough
82b996b145 increase timeout on wordcount 2018-06-25 14:06:18 +01:00
Christopher Hoskin
b3033c1686 Add csh-staging to repos 2018-06-13 15:47:45 +01:00
Christopher Hoskin
547ef679b4 Merge pull request #89 from sharelatex/csh-issue-601
Csh issue 601
2018-06-13 15:45:17 +01:00
Henry Oswald
b30890ef99 remove the compile npm command, it isn't needed 2018-06-12 17:48:23 +01:00
Henry Oswald
926667f365 update build scripts so smoke tests are compiled 2018-06-12 17:44:13 +01:00
Christopher Hoskin
0a70985ba5 Specify repo correctly 2018-06-12 15:26:10 +01:00
Christopher Hoskin
4ca8027cb8 Increase acceptance test timeout. 2018-06-12 15:04:14 +01:00
Christopher Hoskin
da216c52e9 Accidentally left warning message commented out :( 2018-06-12 11:17:26 +01:00
Christopher Hoskin
e6532b5681 Update build scripts from 1.1.3 to 1.1.6 2018-06-12 10:22:30 +01:00
Christopher Hoskin
85aec72206 Use metadata to determine Google Cloud project dynamically. Fixes: #601 2018-06-12 10:15:17 +01:00
Henry Oswald
f000ecb681 Merge branch 'master' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-06-08 19:21:18 +01:00
Henry Oswald
436f69f3a6 Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-05-25 15:33:08 +01:00
Henry Oswald
38e91ab3e4 bumped timeout to 30 seconds 2018-05-25 15:30:26 +01:00
henry oswald
0b3af7d759 changed synctex binary and added it to mounted volumes in docker config 2018-05-25 13:45:07 +00:00
henry oswald
9548615169 all but the sync tests should pass 2018-05-25 12:43:12 +00:00
Henry Oswald
da814b0e3a log settings on startup 2018-05-25 12:01:16 +01:00
Henry Oswald
e544ad9a23 set user to tex for tests run on ci box 2018-05-25 11:51:34 +01:00
Henry Oswald
1814f1c997 added --exit to unit tests 2018-05-24 21:59:02 +01:00
Henry Oswald
98a4e60eb7 update to 1.1.3 build scripts 2018-05-24 19:03:57 +01:00
Henry Oswald
ca23cd42ad update package.json scripts 2018-04-09 11:06:35 +01:00
Henry Oswald
b330ee2d5b grep works with command
updated build scripts
acceptance tests break; files are written as root when user is node
2018-03-29 17:07:22 +01:00
Henry Oswald
b5a7eabaab update build script and add load balancer agent 2018-03-29 12:12:29 +01:00
Henry Oswald
ec75f9fa67 add smoke test env var 2018-03-20 13:48:12 +00:00
Henry Oswald
dc1ea9d3e9 amend comment 2018-03-19 14:22:18 +00:00
Henry Oswald
4d955a8d41 try a build with node user 2018-03-19 14:10:45 +00:00
Henry Oswald
0915ac8c60 run as app user and chmod 777 compiles dir 2018-03-19 12:56:53 +00:00
Henry Oswald
aeb6f48945 try running as root 2018-03-19 09:51:26 +00:00
Henry Oswald
8ccbfc7d32 don't put synctex in as a volume 2018-03-16 18:11:46 +00:00
Henry Oswald
0bd9377018 chown synctex and add the creation of directories in 2018-03-16 17:48:55 +00:00
Henry Oswald
3c1d7ab264 mkdir the /app/bin/synctex-mount 2018-03-16 17:40:10 +00:00
Henry Oswald
3d9a93ad61 add logging of docker options 2018-03-16 17:37:36 +00:00
Henry Oswald
17c51c2ba0 added debugging and new moving commands 2018-03-16 17:30:11 +00:00
Henry Oswald
f4226ecd0e try copying synctex betwen directories 2018-03-16 17:10:56 +00:00
Henry Oswald
6fbfcfc68b move synctex into a directory for simple mounting 2018-03-16 16:50:30 +00:00
Henry Oswald
63145cc60c add synctex back in 2018-03-16 16:22:39 +00:00
Henry Oswald
5739a2aeca comment out synctex for the moment 2018-03-16 16:04:26 +00:00
Henry Oswald
9f8a68be38 add log line for connecting to a db 2018-03-16 15:29:35 +00:00
Henry Oswald
1dce40c61f make compiles dir 2018-03-16 15:25:36 +00:00
Henry Oswald
52982b8fcd remove texlive docker images 2018-03-14 15:44:58 +00:00
Henry Oswald
a741a238a8 have entrypoint kick off download of texlive images
install script exits without error if auth fails.
2018-03-14 15:44:58 +00:00
Henry Oswald
0c1b699bd5 add docker ignore rather than make clean 2018-03-14 15:44:58 +00:00
Henry Oswald
dc3cb439d0 update build scripts 2018-03-14 15:44:58 +00:00
Henry Oswald
83c7068bd1 test new scripts on ci 2018-03-14 15:44:58 +00:00
Henry Oswald
b9d94fb428 fixed commented-out tests 2018-03-14 15:44:58 +00:00
Henry Oswald
7dbed15fea update scripts from latest build scripts 1.1.0 2018-03-14 15:44:58 +00:00
Henry Oswald
3c4870f688 remove touch /var/run/docker.sock which doesn’t work robustly 2018-03-14 15:44:58 +00:00
Henry Oswald
4ff1121353 add cmd back in 2018-03-14 15:44:58 +00:00
Henry Oswald
aca9100c52 set entry point for dockerfile 2018-03-14 15:44:58 +00:00
Henry Oswald
96a237fb74 removed user temporarily, created make ci task 2018-03-14 15:44:58 +00:00
Henry Oswald
4e6514b17e add logging in db.coffee 2018-03-14 15:44:58 +00:00
Henry Oswald
00cf5468d0 update jenkins task 2018-03-14 15:44:58 +00:00
Henry Oswald
177c46df98 add cache dir 2018-03-14 15:44:58 +00:00
Henry Oswald
2f96350b7c removed unused scripts 2018-03-14 15:44:58 +00:00
Henry Oswald
f1df41112b wip for ci 2018-03-14 15:44:58 +00:00
Henry Oswald
b202af3cf2 added docker runner into core codebase
supports both local command runner and docker runner

added docker files for tex live

also fixed tests so they exit correctly & removed debug lines
2018-03-14 15:44:49 +00:00
Henry Oswald
3bdd50a231 fix url fetcher tests so they exit correctly 2018-03-05 10:39:46 +00:00
Henry Oswald
3134b8aada add SYNCTEX_BIN_HOST_PATH for ci 2018-03-03 13:40:29 +00:00
Henry Oswald
aa0f9ee0be Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-03-03 13:37:00 +00:00
Henry Oswald
4dd11f3442 update docker compose ci to use extension file and dockerfile 2018-03-03 13:36:42 +00:00
Henry Oswald
ae7357778e Merge branch 'ho-dockerise' of github.com:sharelatex/clsi-sharelatex into ho-dockerise 2018-03-02 18:31:09 +00:00
Henry Oswald
c6b962a8b9 Merge branch 'master' into ho-dockerise 2018-03-02 18:18:18 +00:00
Henry Oswald
3de14a3f17 Merge branch 'master' into ho-dockerise 2018-03-02 18:16:16 +00:00
Henry Oswald
49a35c5e11 Merge branch 'master' into ho-dockerise 2018-03-02 18:12:32 +00:00
Henry Oswald
b9874b5ae5 built with 1.1.0 scripts 2018-03-02 18:08:13 +00:00
Henry Oswald
5cb3bfcbbb uncomment tests 2018-03-02 17:59:37 +00:00
Henry Oswald
1a47887e80 make timeout latex more complex (slower) 2018-03-02 17:58:34 +00:00
Henry Oswald
70f016af1f unit tests pass, acceptance tests fail
uncomment tests
2018-03-02 17:34:41 +00:00
Henry Oswald
b8c22f4d74 wip, docker container is correctly created 2018-03-02 17:14:23 +00:00
Henry Oswald
8f6db5baff tests pass under app user 2018-03-02 17:14:23 +00:00
Henry Oswald
d698cc318f updated build scripts 2018-03-02 17:14:23 +00:00
Henry Oswald
12b13d6199 mount app as volume in docker container for local tests
change to overrides
2018-03-02 17:14:23 +00:00
Henry Oswald
a02adacc98 updated build scripts with 1.0.3 2018-03-02 17:14:23 +00:00
Henry Oswald
a2a8b70b74 acceptance tests pass inside docker container (apart from sync) 2018-03-02 17:14:23 +00:00
Henry Oswald
017ba3a4ec mvp
needs hacked patch in docker runner

wip

most tests pass
2018-03-02 17:14:20 +00:00
James Allen
b64106b730 Provide hosts and sibling containers as environment settings and add npm run start script
wip acceptance tests run, but don't all pass

wip

removed npm-debug from git
2018-03-02 17:14:18 +00:00
68 changed files with 6255 additions and 458 deletions

.dockerignore (new file, +9)

@@ -0,0 +1,9 @@
node_modules/*
gitrev
.git
.gitignore
.npm
.nvmrc
nodemon.json
app.js
**/js/*

.github/ISSUE_TEMPLATE.md (new file, +38)

@@ -0,0 +1,38 @@
<!-- BUG REPORT TEMPLATE -->
## Steps to Reproduce
<!-- Describe the steps leading up to when / where you found the bug. -->
<!-- Screenshots may be helpful here. -->
1.
2.
3.
## Expected Behaviour
<!-- What should have happened when you completed the steps above? -->
## Observed Behaviour
<!-- What actually happened when you completed the steps above? -->
<!-- Screenshots may be helpful here. -->
## Context
<!-- How has this issue affected you? What were you trying to accomplish? -->
## Technical Info
<!-- Provide any technical details that may be applicable (or N/A if not applicable). -->
* URL:
* Browser Name and version:
* Operating System and version (desktop or mobile):
* Signed in as:
* Project and/or file:
## Analysis
<!--- Optionally, document investigation of / suggest a fix for the bug, e.g. 'comes from this line / commit' -->
## Who Needs to Know?
<!-- If you want to bring this to the attention of particular people, @-mention them below. -->
<!-- If a user reported this bug and should be notified when it is fixed, provide the Front conversation link. -->
-
-

.github/PULL_REQUEST_TEMPLATE.md (new file, +45)

@@ -0,0 +1,45 @@
<!-- Please review https://github.com/overleaf/write_latex/blob/master/.github/CONTRIBUTING.md for guidance on what is expected in each section. -->
### Description
#### Screenshots
#### Related Issues / PRs
### Review
#### Potential Impact
#### Manual Testing Performed
- [ ]
- [ ]
#### Accessibility
### Deployment
#### Deployment Checklist
- [ ] Update documentation not included in the PR (if any)
- [ ]
#### Metrics and Monitoring
#### Who Needs to Know?

.gitignore (5 lines changed)

@@ -7,10 +7,13 @@ test/acceptance/js
 test/acceptance/fixtures/tmp
 compiles
 app.js
+**/*.map
 .DS_Store
 *~
 cache
 .vagrant
 db.sqlite
+db.sqlite-wal
+db.sqlite-shm
 config/*
-bin/synctex
+npm-debug.log

.nvmrc (2 lines changed)

@@ -1 +1 @@
-6.11.2
+10.15.0

.viminfo (new file, +35)

@@ -0,0 +1,35 @@
# This viminfo file was generated by Vim 7.4.
# You may edit it if you're careful!
# Value of 'encoding' when this file was written
*encoding=latin1
# hlsearch on (H) or off (h):
~h
# Command Line History (newest to oldest):
:x
# Search String History (newest to oldest):
# Expression History (newest to oldest):
# Input Line History (newest to oldest):
# Input Line History (newest to oldest):
# Registers:
# File marks:
'0 1 0 ~/hello
# Jumplist (newest first):
-' 1 0 ~/hello
# History of marks within files (newest to oldest):
> ~/hello
" 1 0
^ 1 1
. 1 0
+ 1 0

Dockerfile (new file, +27)

@@ -0,0 +1,27 @@
FROM node:10.15.0 as app

WORKDIR /app

# wildcard as some files may not be in all repos
COPY package*.json npm-shrink*.json /app/

RUN npm install --quiet

COPY . /app

RUN npm run compile:all

FROM node:10.15.0

RUN \
  apt -y update && \
  apt -y install moreutils

COPY --from=app /app /app

WORKDIR /app

RUN chmod 0755 ./install_deps.sh && ./install_deps.sh

ENTRYPOINT ["/bin/bash", "entrypoint.sh"]

CMD ["node", "--expose-gc", "app.js"]
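A hedged example of building and running the image this Dockerfile produces; the tag is arbitrary, and 3013 is the default port the service listens on (see app.coffee below):

    $ docker build -t clsi .
    $ docker run --rm -p 3013:3013 clsi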

Gruntfile.coffee (deleted, -104)

@@ -1,104 +0,0 @@
spawn = require("child_process").spawn

module.exports = (grunt) ->
  grunt.initConfig
    coffee:
      app_src:
        expand: true,
        flatten: true,
        cwd: "app"
        src: ['coffee/*.coffee'],
        dest: 'app/js/',
        ext: '.js'

      app:
        src: "app.coffee"
        dest: "app.js"

      unit_tests:
        expand: true
        cwd: "test/unit/coffee"
        src: ["**/*.coffee"]
        dest: "test/unit/js/"
        ext: ".js"

      acceptance_tests:
        expand: true
        cwd: "test/acceptance/coffee"
        src: ["**/*.coffee"]
        dest: "test/acceptance/js/"
        ext: ".js"

      smoke_tests:
        expand: true
        cwd: "test/smoke/coffee"
        src: ["**/*.coffee"]
        dest: "test/smoke/js"
        ext: ".js"

    clean:
      app: ["app/js/"]
      unit_tests: ["test/unit/js"]
      acceptance_tests: ["test/acceptance/js"]
      smoke_tests: ["test/smoke/js"]

    execute:
      app:
        src: "app.js"

    mkdir:
      all:
        options:
          create: ["cache", "compiles"]

    mochaTest:
      unit:
        options:
          reporter: "spec"
          grep: grunt.option("grep")
        src: ["test/unit/js/**/*.js"]
      acceptance:
        options:
          reporter: "spec"
          timeout: 40000
          grep: grunt.option("grep")
        src: ["test/acceptance/js/**/*.js"]
      smoke:
        options:
          reported: "spec"
          timeout: 10000
        src: ["test/smoke/js/**/*.js"]

  grunt.loadNpmTasks 'grunt-contrib-coffee'
  grunt.loadNpmTasks 'grunt-contrib-clean'
  grunt.loadNpmTasks 'grunt-mocha-test'
  grunt.loadNpmTasks 'grunt-shell'
  grunt.loadNpmTasks 'grunt-execute'
  grunt.loadNpmTasks 'grunt-bunyan'
  grunt.loadNpmTasks 'grunt-mkdir'

  grunt.registerTask 'compile:bin', () ->
    callback = @async()
    proc = spawn "cc", [
      "-o", "bin/synctex", "-Isrc/synctex",
      "src/synctex.c", "src/synctex/synctex_parser.c", "src/synctex/synctex_parser_utils.c", "-lz"
    ], stdio: "inherit"
    proc.on "close", callback

  grunt.registerTask 'compile:app', ['clean:app', 'coffee:app', 'coffee:app_src', 'coffee:smoke_tests', 'compile:bin']
  grunt.registerTask 'run', ['compile:app', 'bunyan', 'execute']
  grunt.registerTask 'compile:unit_tests', ['clean:unit_tests', 'coffee:unit_tests']
  grunt.registerTask 'test:unit', ['compile:app', 'compile:unit_tests', 'mochaTest:unit']
  grunt.registerTask 'compile:acceptance_tests', ['clean:acceptance_tests', 'coffee:acceptance_tests']
  grunt.registerTask 'test:acceptance', ['compile:acceptance_tests', 'mochaTest:acceptance']
  grunt.registerTask 'compile:smoke_tests', ['clean:smoke_tests', 'coffee:smoke_tests']
  grunt.registerTask 'test:smoke', ['compile:smoke_tests', 'mochaTest:smoke']
  grunt.registerTask 'install', 'compile:app'
  grunt.registerTask 'default', ['mkdir', 'run']

Jenkinsfile (116 lines changed)

@@ -1,79 +1,75 @@
-pipeline {
+String cron_string = BRANCH_NAME == "master" ? "@daily" : ""
+
+pipeline {
   agent any
+
+  environment {
+    GIT_PROJECT = "clsi"
+    JENKINS_WORKFLOW = "clsi-sharelatex"
+    TARGET_URL = "${env.JENKINS_URL}blue/organizations/jenkins/${JENKINS_WORKFLOW}/detail/$BRANCH_NAME/$BUILD_NUMBER/pipeline"
+    GIT_API_URL = "https://api.github.com/repos/overleaf/${GIT_PROJECT}/statuses/$GIT_COMMIT"
+  }
+
   triggers {
     pollSCM('* * * * *')
-    cron('@daily')
+    cron(cron_string)
   }
   stages {
+    stage('Clean') {
+      steps {
+        // This is a terrible hack to set the file ownership to jenkins:jenkins so we can cleanup the directory
+        sh 'docker run -v $(pwd):/app --rm busybox /bin/chown -R 111:119 /app'
+        sh 'rm -fr node_modules'
+      }
+    }
     stage('Install') {
-      agent {
-        docker {
-          image 'node:6.11.2'
-          args "-v /var/lib/jenkins/.npm:/tmp/.npm -e HOME=/tmp"
-          reuseNode true
-        }
-      }
       steps {
-        sh 'git config --global core.logallrefupdates false'
-        sh 'rm -fr node_modules'
-        checkout([$class: 'GitSCM', branches: [[name: '*/master']], extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: '_docker-runner'], [$class: 'CloneOption', shallow: true]], userRemoteConfigs: [[credentialsId: 'GIT_DEPLOY_KEY', url: 'git@github.com:sharelatex/docker-runner-sharelatex']]])
-        sh 'npm install ./_docker-runner'
-        sh 'rm -fr ./_docker-runner ./_docker-runner@tmp'
-        sh 'npm install'
-        sh 'npm rebuild'
-        sh 'npm install --quiet grunt-cli'
+        withCredentials([usernamePassword(credentialsId: 'GITHUB_INTEGRATION', usernameVariable: 'GH_AUTH_USERNAME', passwordVariable: 'GH_AUTH_PASSWORD')]) {
+          sh "curl $GIT_API_URL \
+            --data '{ \
+            \"state\" : \"pending\", \
+            \"target_url\": \"$TARGET_URL\", \
+            \"description\": \"Your build is underway\", \
+            \"context\": \"ci/jenkins\" }' \
+            -u $GH_AUTH_USERNAME:$GH_AUTH_PASSWORD"
+        }
       }
     }
-    stage('Compile and Test') {
-      agent {
-        docker {
-          image 'node:6.11.2'
-          reuseNode true
-        }
-      }
+    stage('Build') {
       steps {
-        sh 'node_modules/.bin/grunt compile:app'
-        sh 'node_modules/.bin/grunt compile:acceptance_tests'
-        sh 'NODE_ENV=development node_modules/.bin/grunt test:unit'
+        sh 'make build'
+      }
+    }
+    stage('Unit Tests') {
+      steps {
+        sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_unit'
       }
     }
     stage('Acceptance Tests') {
-      environment {
-        TEXLIVE_IMAGE="quay.io/sharelatex/texlive-full:2017.1"
-      }
       steps {
-        sh 'mkdir -p compiles cache'
-        // Not yet running, due to volumes/sibling containers
-        sh 'docker container prune -f'
-        sh 'docker pull $TEXLIVE_IMAGE'
-        sh 'docker pull sharelatex/acceptance-test-runner:clsi-6.11.2'
-        sh 'docker run --rm -e SIBLING_CONTAINER_USER=root -e SANDBOXED_COMPILES_HOST_DIR=$(pwd)/compiles -e SANDBOXED_COMPILES_SIBLING_CONTAINERS=true -e TEXLIVE_IMAGE=$TEXLIVE_IMAGE -v /var/run/docker.sock:/var/run/docker.sock -v $(pwd):/app sharelatex/acceptance-test-runner:clsi-6.11.2'
-        // This is a terrible hack to set the file ownership to jenkins:jenkins so we can cleanup the directory
-        sh 'docker run -v $(pwd):/app --rm busybox /bin/chown -R 111:119 /app'
-        sh 'rm -r compiles cache server.log db.sqlite config/settings.defaults.coffee'
+        sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_acceptance'
       }
     }
-    stage('Package') {
+    stage('Package and docker push') {
      steps {
         sh 'echo ${BUILD_NUMBER} > build_number.txt'
         sh 'touch build.tar.gz' // Avoid tar warning about files changing during read
-        sh 'tar -czf build.tar.gz --exclude=build.tar.gz --exclude-vcs .'
+        sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make tar'
+        withCredentials([file(credentialsId: 'gcr.io_overleaf-ops', variable: 'DOCKER_REPO_KEY_PATH')]) {
+          sh 'docker login -u _json_key --password-stdin https://gcr.io/overleaf-ops < ${DOCKER_REPO_KEY_PATH}'
+        }
+        sh 'DOCKER_REPO=gcr.io/overleaf-ops make publish'
+        sh 'docker logout https://gcr.io/overleaf-ops'
       }
     }
-    stage('Publish') {
+    stage('Publish to s3') {
       steps {
+        sh 'echo ${BRANCH_NAME}-${BUILD_NUMBER} > build_number.txt'
         withAWS(credentials:'S3_CI_BUILDS_AWS_KEYS', region:"${S3_REGION_BUILD_ARTEFACTS}") {
           s3Upload(file:'build.tar.gz', bucket:"${S3_BUCKET_BUILD_ARTEFACTS}", path:"${JOB_NAME}/${BUILD_NUMBER}.tar.gz")
-        }
-        withAWS(credentials:'S3_CI_BUILDS_AWS_KEYS', region:"${S3_REGION_BUILD_ARTEFACTS}") {
           // The deployment process uses this file to figure out the latest build
           s3Upload(file:'build_number.txt', bucket:"${S3_BUCKET_BUILD_ARTEFACTS}", path:"${JOB_NAME}/latest")
         }
@@ -82,11 +78,37 @@ pipeline {
     }
   }
   post {
+    always {
+      sh 'DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_clean'
+      sh 'make clean'
+    }
+    success {
+      withCredentials([usernamePassword(credentialsId: 'GITHUB_INTEGRATION', usernameVariable: 'GH_AUTH_USERNAME', passwordVariable: 'GH_AUTH_PASSWORD')]) {
+        sh "curl $GIT_API_URL \
+          --data '{ \
+          \"state\" : \"success\", \
+          \"target_url\": \"$TARGET_URL\", \
+          \"description\": \"Your build succeeded!\", \
+          \"context\": \"ci/jenkins\" }' \
+          -u $GH_AUTH_USERNAME:$GH_AUTH_PASSWORD"
+      }
+    }
     failure {
       mail(from: "${EMAIL_ALERT_FROM}",
         to: "${EMAIL_ALERT_TO}",
         subject: "Jenkins build failed: ${JOB_NAME}:${BUILD_NUMBER}",
         body: "Build: ${BUILD_URL}")
+      withCredentials([usernamePassword(credentialsId: 'GITHUB_INTEGRATION', usernameVariable: 'GH_AUTH_USERNAME', passwordVariable: 'GH_AUTH_PASSWORD')]) {
+        sh "curl $GIT_API_URL \
+          --data '{ \
+          \"state\" : \"failure\", \
+          \"target_url\": \"$TARGET_URL\", \
+          \"description\": \"Your build failed\", \
+          \"context\": \"ci/jenkins\" }' \
+          -u $GH_AUTH_USERNAME:$GH_AUTH_PASSWORD"
+      }
     }
   }
 }

Makefile (new file, +51)

@@ -0,0 +1,51 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/sharelatex/sharelatex-dev-environment
# Version: 1.1.22

BUILD_NUMBER ?= local
BRANCH_NAME ?= $(shell git rev-parse --abbrev-ref HEAD)
PROJECT_NAME = clsi
DOCKER_COMPOSE_FLAGS ?= -f docker-compose.yml
DOCKER_COMPOSE := BUILD_NUMBER=$(BUILD_NUMBER) \
	BRANCH_NAME=$(BRANCH_NAME) \
	PROJECT_NAME=$(PROJECT_NAME) \
	MOCHA_GREP=${MOCHA_GREP} \
	docker-compose ${DOCKER_COMPOSE_FLAGS}

clean:
	docker rmi ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
	docker rmi gcr.io/overleaf-ops/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)
	rm -f app.js
	rm -rf app/js
	rm -rf test/unit/js
	rm -rf test/acceptance/js

test: test_unit test_acceptance

test_unit:
	@[ ! -d test/unit ] && echo "clsi has no unit tests" || $(DOCKER_COMPOSE) run --rm test_unit

test_acceptance: test_clean test_acceptance_pre_run test_acceptance_run

test_acceptance_run:
	@[ ! -d test/acceptance ] && echo "clsi has no acceptance tests" || $(DOCKER_COMPOSE) run --rm test_acceptance

test_clean:
	$(DOCKER_COMPOSE) down -v -t 0

test_acceptance_pre_run:
	@[ ! -f test/acceptance/js/scripts/pre-run ] && echo "clsi has no pre acceptance tests task" || $(DOCKER_COMPOSE) run --rm test_acceptance test/acceptance/js/scripts/pre-run

build:
	docker build --pull --tag ci/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) \
		--tag gcr.io/overleaf-ops/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER) \
		.

tar:
	$(DOCKER_COMPOSE) up tar

publish:
	docker push $(DOCKER_REPO)/$(PROJECT_NAME):$(BRANCH_NAME)-$(BUILD_NUMBER)

.PHONY: clean test test_unit test_acceptance test_clean build publish
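Typical local invocations of these targets, mirroring what the Jenkinsfile above runs in CI:

    $ make build
    $ DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_unit
    $ DOCKER_COMPOSE_FLAGS="-f docker-compose.ci.yml" make test_acceptance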

README.md

@@ -1,16 +1,38 @@
-clsi-sharelatex
+overleaf/clsi
 ===============
 
 A web api for compiling LaTeX documents in the cloud
 
-[![Build Status](https://travis-ci.org/sharelatex/clsi-sharelatex.png?branch=master)](https://travis-ci.org/sharelatex/clsi-sharelatex)
+The Common LaTeX Service Interface (CLSI) provides a RESTful interface to traditional LaTeX tools (or, more generally, any command line tool for composing marked-up documents into a display format such as PDF or HTML). The CLSI listens on the following ports by default:
+
+* TCP/3009 - the RESTful interface
+* TCP/3048 - reports load information
+* TCP/3049 - HTTP interface to control the CLSI service
+
+These defaults can be modified in `config/settings.defaults.coffee`.
+
+The provided `Dockerfile` builds a docker image which has the docker command line tools installed. The configuration in `docker-compose-config.yml` mounts the docker socket, so that the CLSI container can talk to the docker host it is running in. This allows it to spin up `sibling containers` running an image with a TeX distribution installed to perform the actual compiles.
+
+The CLSI can be configured through the following environment variables:
+
+* `DOCKER_RUNNER` - Set to true to use sibling containers
+* `SYNCTEX_BIN_HOST_PATH` - Path to SyncTeX binary
+* `COMPILES_HOST_DIR` - Working directory for LaTeX compiles
+* `SQLITE_PATH` - Path to SQLite database
+* `TEXLIVE_IMAGE` - The TeX Live docker image to use for sibling containers, e.g. `gcr.io/overleaf-ops/texlive-full:2017.1`
+* `TEXLIVE_IMAGE_USER` - When using sibling containers, the user to run as in the TeX Live image. Defaults to `tex`
+* `TEX_LIVE_IMAGE_NAME_OVERRIDE` - The name of the registry for the docker image, e.g. `gcr.io/overleaf-ops`
+* `FILESTORE_DOMAIN_OVERRIDE` - The url for the filestore service, e.g. `http://$FILESTORE_HOST:3009`
+* `STATSD_HOST` - The address of the StatsD service (used by the metrics module)
+* `LISTEN_ADDRESS` - The address for the RESTful service to listen on. Set to `0.0.0.0` to listen on all network interfaces
+* `SMOKE_TEST` - Whether to run smoke tests
 
 Installation
 ------------
 
-The CLSI can be installed and set up as part of the entire [ShareLaTeX stack](https://github.com/sharelatex/sharelatex) (complete with front end editor and document storage), or it can be run as a standalone service. To run is as a standalone service, first checkout this repository:
+The CLSI can be installed and set up as part of the entire [Overleaf stack](https://github.com/overleaf/overleaf) (complete with front end editor and document storage), or it can be run as a standalone service. To run it as a standalone service, first check out this repository:
 
-    $ git clone git@github.com:sharelatex/clsi-sharelatex.git
+    $ git clone git@github.com:overleaf/clsi.git
 
-Then install the require npm modules:
+Then install the required npm modules:
 
@@ -92,4 +114,4 @@ License
 
 The code in this repository is released under the GNU AFFERO GENERAL PUBLIC LICENSE, version 3. A copy can be found in the `LICENSE` file.
 
-Copyright (c) ShareLaTeX, 2014.
+Copyright (c) Overleaf, 2014-2019.
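As a hedged illustration of how these variables might be combined for sandboxed (sibling-container) compiles — the paths are invented examples, not values taken from this diff:

    DOCKER_RUNNER=true
    TEXLIVE_IMAGE=quay.io/sharelatex/texlive-full:2017.1
    TEXLIVE_IMAGE_USER=tex
    COMPILES_HOST_DIR=/var/lib/clsi/compiles
    SYNCTEX_BIN_HOST_PATH=/var/lib/clsi/bin/synctex
    LISTEN_ADDRESS=0.0.0.0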

app.coffee

@@ -1,3 +1,6 @@
+Metrics = require "metrics-sharelatex"
+Metrics.initialize("clsi")
+
 CompileController = require "./app/js/CompileController"
 Settings = require "settings-sharelatex"
 logger = require "logger-sharelatex"
@@ -12,8 +15,7 @@ Errors = require './app/js/Errors'
 Path = require "path"
 fs = require "fs"
 
-Metrics = require "metrics-sharelatex"
-Metrics.initialize("clsi")
 Metrics.open_sockets.monitor(logger)
 Metrics.memory.monitor(logger)
@@ -26,15 +28,17 @@ express = require "express"
 bodyParser = require "body-parser"
 
 app = express()
+Metrics.injectMetricsRoute(app)
 app.use Metrics.http.monitor(logger)
 
 # Compile requests can take longer than the default two
 # minutes (including file download time), so bump up the
 # timeout a bit.
-TIMEOUT = 6 * 60 * 1000
+TIMEOUT = 10 * 60 * 1000
 app.use (req, res, next) ->
   req.setTimeout TIMEOUT
   res.setTimeout TIMEOUT
+  res.removeHeader("X-Powered-By")
   next()
 
 app.param 'project_id', (req, res, next, project_id) ->
@@ -56,7 +60,7 @@ app.param 'build_id', (req, res, next, build_id) ->
     next new Error("invalid build id #{build_id}")
 
-app.post "/project/:project_id/compile", bodyParser.json(limit: "5mb"), CompileController.compile
+app.post "/project/:project_id/compile", bodyParser.json(limit: Settings.compileSizeLimit), CompileController.compile
 app.post "/project/:project_id/compile/stop", CompileController.stopCompile
 app.delete "/project/:project_id", CompileController.clearCache
@@ -66,7 +70,7 @@ app.get "/project/:project_id/wordcount", CompileController.wordcount
 app.get "/project/:project_id/status", CompileController.status
 
 # Per-user containers
-app.post "/project/:project_id/user/:user_id/compile", bodyParser.json(limit: "5mb"), CompileController.compile
+app.post "/project/:project_id/user/:user_id/compile", bodyParser.json(limit: Settings.compileSizeLimit), CompileController.compile
 app.post "/project/:project_id/user/:user_id/compile/stop", CompileController.stopCompile
 app.delete "/project/:project_id/user/:user_id", CompileController.clearCache
@@ -139,7 +143,10 @@ app.get "/health_check", (req, res)->
     res.contentType(resCacher?.setContentType)
     res.status(resCacher?.code).send(resCacher?.body)
 
-profiler = require "v8-profiler"
+app.get "/smoke_test_force", (req, res)->
+  smokeTest.run(require.resolve(__dirname + "/test/smoke/js/SmokeTests.js"))(req, res)
+
+profiler = require "v8-profiler-node8"
 app.get "/profile", (req, res) ->
   time = parseInt(req.query.time || "1000")
   profiler.startProfiling("test")
@@ -160,8 +167,76 @@ app.use (error, req, res, next) ->
   logger.error {err: error, url: req.url}, "server error"
   res.sendStatus(error?.statusCode || 500)
 
-app.listen port = (Settings.internal?.clsi?.port or 3013), host = (Settings.internal?.clsi?.host or "localhost"), (error) ->
-  logger.info "CLSI starting up, listening on #{host}:#{port}"
+net = require "net"
+os = require "os"
+
+STATE = "up"
+
+loadTcpServer = net.createServer (socket) ->
+  socket.on "error", (err)->
+    if err.code == "ECONNRESET"
+      # this always comes up, we don't know why
+      return
+    logger.err err:err, "error with socket on load check"
+    socket.destroy()
+
+  if STATE == "up" and Settings.internal.load_balancer_agent.report_load
+    currentLoad = os.loadavg()[0]
+    # staging clsi's have 1 cpu core only
+    if os.cpus().length == 1
+      availableWorkingCpus = 1
+    else
+      availableWorkingCpus = os.cpus().length - 1
+    freeLoad = availableWorkingCpus - currentLoad
+    freeLoadPercentage = Math.round((freeLoad / availableWorkingCpus) * 100)
+    if freeLoadPercentage <= 0
+      freeLoadPercentage = 1 # when its 0 the server is set to drain and will move projects to different servers
+    socket.write("up, #{freeLoadPercentage}%\n", "ASCII")
+    socket.end()
+  else
+    socket.write("#{STATE}\n", "ASCII")
+    socket.end()
+
+loadHttpServer = express()
+
+loadHttpServer.post "/state/up", (req, res, next) ->
+  STATE = "up"
+  logger.info "getting message to set server to up"
+  res.sendStatus 204
+
+loadHttpServer.post "/state/down", (req, res, next) ->
+  STATE = "down"
+  logger.info "getting message to set server to down"
+  res.sendStatus 204
+
+loadHttpServer.post "/state/maint", (req, res, next) ->
+  STATE = "maint"
+  logger.info "getting message to set server to maint"
+  res.sendStatus 204
+
+port = (Settings.internal?.clsi?.port or 3013)
+host = (Settings.internal?.clsi?.host or "localhost")
+load_tcp_port = Settings.internal.load_balancer_agent.load_port
+load_http_port = Settings.internal.load_balancer_agent.local_port
+
+if !module.parent # Called directly
+  app.listen port, host, (error) ->
+    logger.info "CLSI starting up, listening on #{host}:#{port}"
+
+  loadTcpServer.listen load_tcp_port, host, (error) ->
+    throw error if error?
+    logger.info "Load tcp agent listening on load port #{load_tcp_port}"
+
+  loadHttpServer.listen load_http_port, host, (error) ->
+    throw error if error?
+    logger.info "Load http agent listening on load port #{load_http_port}"
+
+module.exports = app
+
 setInterval () ->
   ProjectPersistenceManager.clearExpiredProjects()
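The TCP load agent added above speaks a one-line protocol: a load balancer connects, the agent writes either the bare state ("down", "maint") or "up" plus a free-capacity percentage, then closes the socket. Worked through for a hypothetical 4-core box with a 1-minute load average of 1.5: availableWorkingCpus = 4 - 1 = 3, freeLoad = 3 - 1.5 = 1.5, so freeLoadPercentage = round((1.5 / 3) * 100) = 50 and the reply is "up, 50%". A probe against the default load port from the README (TCP/3048) might look like:

    $ nc localhost 3048
    up, 50%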

CommandRunner.coffee

@@ -1,44 +1,11 @@
-spawn = require("child_process").spawn
+Settings = require "settings-sharelatex"
 logger = require "logger-sharelatex"
 
-logger.info "using standard command runner"
+if Settings.clsi?.dockerRunner == true
+  commandRunnerPath = "./DockerRunner"
+else
+  commandRunnerPath = "./LocalCommandRunner"
+logger.info commandRunnerPath:commandRunnerPath, "selecting command runner for clsi"
+CommandRunner = require(commandRunnerPath)
 
-module.exports = CommandRunner =
-  run: (project_id, command, directory, image, timeout, environment, callback = (error) ->) ->
-    command = (arg.replace('$COMPILE_DIR', directory) for arg in command)
-    logger.log project_id: project_id, command: command, directory: directory, "running command"
-    logger.warn "timeouts and sandboxing are not enabled with CommandRunner"
-
-    # merge environment settings
-    env = {}
-    env[key] = value for key, value of process.env
-    env[key] = value for key, value of environment
-
-    # run command as detached process so it has its own process group (which can be killed if needed)
-    proc = spawn command[0], command.slice(1), stdio: "inherit", cwd: directory, detached: true, env: env
-
-    proc.on "error", (err)->
-      logger.err err:err, project_id:project_id, command: command, directory: directory, "error running command"
-      callback(err)
-
-    proc.on "close", (code, signal) ->
-      logger.info code:code, signal:signal, project_id:project_id, "command exited"
-      if signal is 'SIGTERM' # signal from kill method below
-        err = new Error("terminated")
-        err.terminated = true
-        return callback(err)
-      else if code is 1 # exit status from chktex
-        err = new Error("exited")
-        err.code = code
-        return callback(err)
-      else
-        callback()
-
-    return proc.pid # return process id to allow job to be killed if necessary
-
-  kill: (pid, callback = (error) ->) ->
-    try
-      process.kill -pid # kill all processes in group
-    catch err
-      return callback(err)
-    callback()
+module.exports = CommandRunner
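The runner is now chosen once at require time. A minimal settings fragment that would select the Docker runner — the shape is inferred from the conditional above, not taken from a settings file in this diff:

    module.exports =
      clsi:
        dockerRunner: true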

CompileController.coffee

@@ -33,14 +33,17 @@ module.exports = CompileController =
       else
         status = "error"
         code = 500
-        logger.error err: error, project_id: request.project_id, "error running compile"
+        logger.warn err: error, project_id: request.project_id, "error running compile"
     else
       status = "failure"
       for file in outputFiles
         if file.path?.match(/output\.pdf$/)
           status = "success"
+        if file.path?.match(/output\.html$/)
+          status = "success" if status == "failure"
+      logger.warn project_id: request.project_id, outputFiles:outputFiles, "project failed to compile successfully, no output.pdf generated"
 
     # log an error if any core files are found
     for file in outputFiles
       if file.path is "core"
@@ -50,7 +53,7 @@ module.exports = CompileController =
     res.status(code or 200).send {
       compile:
         status: status
         error: error?.message or error
         outputFiles: outputFiles.map (file) ->
           url:
             "#{Settings.apis.clsi.url}/project/#{request.project_id}" +
@@ -79,10 +82,9 @@ module.exports = CompileController =
     column = parseInt(req.query.column, 10)
     project_id = req.params.project_id
     user_id = req.params.user_id
 
     CompileManager.syncFromCode project_id, user_id, file, line, column, (error, pdfPositions) ->
       return next(error) if error?
-      res.send JSON.stringify {
+      res.json {
         pdf: pdfPositions
       }
@@ -92,10 +94,9 @@ module.exports = CompileController =
     v = parseFloat(req.query.v)
     project_id = req.params.project_id
     user_id = req.params.user_id
 
     CompileManager.syncFromPdf project_id, user_id, page, h, v, (error, codePositions) ->
       return next(error) if error?
-      res.send JSON.stringify {
+      res.json {
         code: codePositions
       }
@@ -108,7 +109,7 @@ module.exports = CompileController =
     CompileManager.wordcount project_id, user_id, file, image, (error, result) ->
       return next(error) if error?
-      res.send JSON.stringify {
+      res.json {
         texcount: result
       }

CompileManager.coffee

@@ -15,10 +15,7 @@ fse = require "fs-extra"
os = require("os") os = require("os")
async = require "async" async = require "async"
Errors = require './Errors' Errors = require './Errors'
CommandRunner = require "./CommandRunner"
commandRunner = Settings.clsi?.commandRunner or "./CommandRunner"
logger.info commandRunner:commandRunner, "selecting command runner for clsi"
CommandRunner = require(commandRunner)
getCompileName = (project_id, user_id) -> getCompileName = (project_id, user_id) ->
if user_id? then "#{project_id}-#{user_id}" else project_id if user_id? then "#{project_id}-#{user_id}" else project_id
@@ -41,7 +38,6 @@ module.exports = CompileManager =
doCompile: (request, callback = (error, outputFiles) ->) -> doCompile: (request, callback = (error, outputFiles) ->) ->
compileDir = getCompileDir(request.project_id, request.user_id) compileDir = getCompileDir(request.project_id, request.user_id)
timer = new Metrics.Timer("write-to-disk") timer = new Metrics.Timer("write-to-disk")
logger.log project_id: request.project_id, user_id: request.user_id, "syncing resources to disk" logger.log project_id: request.project_id, user_id: request.user_id, "syncing resources to disk"
ResourceWriter.syncResourcesToDisk request, compileDir, (error, resourceList) -> ResourceWriter.syncResourcesToDisk request, compileDir, (error, resourceList) ->
@@ -62,9 +58,9 @@ module.exports = CompileManager =
callback() callback()
createTikzFileIfRequired = (callback) -> createTikzFileIfRequired = (callback) ->
TikzManager.checkMainFile compileDir, request.rootResourcePath, resourceList, (error, usesTikzExternalize) -> TikzManager.checkMainFile compileDir, request.rootResourcePath, resourceList, (error, needsMainFile) ->
return callback(error) if error? return callback(error) if error?
if usesTikzExternalize if needsMainFile
TikzManager.injectOutputFile compileDir, request.rootResourcePath, callback TikzManager.injectOutputFile compileDir, request.rootResourcePath, callback
else else
callback() callback()
@@ -97,6 +93,7 @@ module.exports = CompileManager =
compiler: request.compiler compiler: request.compiler
timeout: request.timeout timeout: request.timeout
image: request.imageName image: request.imageName
flags: request.flags
environment: env environment: env
}, (error, output, stats, timings) -> }, (error, output, stats, timings) ->
# request was for validation only # request was for validation only
@@ -134,7 +131,7 @@ module.exports = CompileManager =
return callback(error) if error? return callback(error) if error?
OutputCacheManager.saveOutputFiles outputFiles, compileDir, (error, newOutputFiles) -> OutputCacheManager.saveOutputFiles outputFiles, compileDir, (error, newOutputFiles) ->
callback null, newOutputFiles callback null, newOutputFiles
stopCompile: (project_id, user_id, callback = (error) ->) -> stopCompile: (project_id, user_id, callback = (error) ->) ->
compileName = getCompileName(project_id, user_id) compileName = getCompileName(project_id, user_id)
LatexRunner.killLatex compileName, callback LatexRunner.killLatex compileName, callback
@@ -205,21 +202,31 @@ module.exports = CompileManager =
    base_dir = Settings.path.synctexBaseDir(compileName)
    file_path = base_dir + "/" + file_name
    compileDir = getCompileDir(project_id, user_id)
-   synctex_path = Path.join(compileDir, "output.pdf")
-   CompileManager._runSynctex ["code", synctex_path, file_path, line, column], (error, stdout) ->
-     return callback(error) if error?
-     logger.log project_id: project_id, user_id:user_id, file_name: file_name, line: line, column: column, stdout: stdout, "synctex code output"
-     callback null, CompileManager._parseSynctexFromCodeOutput(stdout)
+   synctex_path = "#{base_dir}/output.pdf"
+   command = ["code", synctex_path, file_path, line, column]
+   fse.ensureDir compileDir, (error) ->
+     if error?
+       logger.err {error, project_id, user_id, file_name}, "error ensuring dir for sync from code"
+       return callback(error)
+     CompileManager._runSynctex project_id, user_id, command, (error, stdout) ->
+       return callback(error) if error?
+       logger.log project_id: project_id, user_id:user_id, file_name: file_name, line: line, column: column, command:command, stdout: stdout, "synctex code output"
+       callback null, CompileManager._parseSynctexFromCodeOutput(stdout)
  syncFromPdf: (project_id, user_id, page, h, v, callback = (error, filePositions) ->) ->
    compileName = getCompileName(project_id, user_id)
    compileDir = getCompileDir(project_id, user_id)
    base_dir = Settings.path.synctexBaseDir(compileName)
-   synctex_path = Path.join(compileDir, "output.pdf")
-   CompileManager._runSynctex ["pdf", synctex_path, page, h, v], (error, stdout) ->
-     return callback(error) if error?
-     logger.log project_id: project_id, user_id:user_id, page: page, h: h, v:v, stdout: stdout, "synctex pdf output"
-     callback null, CompileManager._parseSynctexFromPdfOutput(stdout, base_dir)
+   synctex_path = "#{base_dir}/output.pdf"
+   command = ["pdf", synctex_path, page, h, v]
+   fse.ensureDir compileDir, (error) ->
+     if error?
+       logger.err {error, project_id, user_id, file_name}, "error ensuring dir for sync to code"
+       return callback(error)
+     CompileManager._runSynctex project_id, user_id, command, (error, stdout) ->
+       return callback(error) if error?
+       logger.log project_id: project_id, user_id:user_id, page: page, h: h, v:v, stdout: stdout, "synctex pdf output"
+       callback null, CompileManager._parseSynctexFromPdfOutput(stdout, base_dir)
  _checkFileExists: (path, callback = (error) ->) ->
    synctexDir = Path.dirname(path)
@@ -235,19 +242,19 @@
      return callback(new Error("not a file")) if not stats?.isFile()
      callback()
- _runSynctex: (args, callback = (error, stdout) ->) ->
-   bin_path = Path.resolve(__dirname + "/../../bin/synctex")
+ _runSynctex: (project_id, user_id, command, callback = (error, stdout) ->) ->
    seconds = 1000
-   outputFilePath = args[1]
-   CompileManager._checkFileExists outputFilePath, (error) ->
-     return callback(error) if error?
-     if Settings.clsi?.synctexCommandWrapper?
-       [bin_path, args] = Settings.clsi?.synctexCommandWrapper bin_path, args
-     child_process.execFile bin_path, args, timeout: 10 * seconds, (error, stdout, stderr) ->
-       if error?
-         logger.err err:error, args:args, "error running synctex"
-         return callback(error)
-       callback(null, stdout)
+   command.unshift("/opt/synctex")
+   directory = getCompileDir(project_id, user_id)
+   timeout = 60 * 1000 # increased to allow for large projects
+   compileName = getCompileName(project_id, user_id)
+   CommandRunner.run compileName, command, directory, Settings.clsi?.docker.image, timeout, {}, (error, output) ->
+     if error?
+       logger.err err:error, command:command, project_id:project_id, user_id:user_id, "error running synctex"
+       return callback(error)
+     callback(null, output.stdout)
  _parseSynctexFromCodeOutput: (output) ->
    results = []
@@ -276,23 +283,28 @@ module.exports = CompileManager =
      }
      return results
  wordcount: (project_id, user_id, file_name, image, callback = (error, pdfPositions) ->) ->
    logger.log project_id:project_id, user_id:user_id, file_name:file_name, image:image, "running wordcount"
    file_path = "$COMPILE_DIR/" + file_name
    command = [ "texcount", '-nocol', '-inc', file_path, "-out=" + file_path + ".wc"]
-   directory = getCompileDir(project_id, user_id)
-   timeout = 10 * 1000
+   compileDir = getCompileDir(project_id, user_id)
+   timeout = 60 * 1000
    compileName = getCompileName(project_id, user_id)
-   CommandRunner.run compileName, command, directory, image, timeout, {}, (error) ->
-     return callback(error) if error?
-     fs.readFile directory + "/" + file_name + ".wc", "utf-8", (err, stdout) ->
-       if err?
-         logger.err err:err, command:command, directory:directory, project_id:project_id, user_id:user_id, "error reading word count output"
-         return callback(err)
-       results = CompileManager._parseWordcountFromOutput(stdout)
-       logger.log project_id:project_id, user_id:user_id, wordcount: results, "word count results"
-       callback null, results
+   fse.ensureDir compileDir, (error) ->
+     if error?
+       logger.err {error, project_id, user_id, file_name}, "error ensuring dir for sync from code"
+       return callback(error)
+     CommandRunner.run compileName, command, compileDir, image, timeout, {}, (error) ->
+       return callback(error) if error?
+       fs.readFile compileDir + "/" + file_name + ".wc", "utf-8", (err, stdout) ->
+         if err?
+           # call it node_err so sentry doesn't use random path error as unique id so it can't be ignored
+           logger.err node_err:err, command:command, compileDir:compileDir, project_id:project_id, user_id:user_id, "error reading word count output"
+           return callback(err)
+         results = CompileManager._parseWordcountFromOutput(stdout)
+         logger.log project_id:project_id, user_id:user_id, wordcount: results, "word count results"
+         callback null, results
  _parseWordcountFromOutput: (output) ->
    results = {

app/coffee/DbQueue.coffee (new file)

@@ -0,0 +1,13 @@
async = require "async"
Settings = require "settings-sharelatex"
logger = require("logger-sharelatex")

queue = async.queue((task, cb)->
  task(cb)
, Settings.parallelSqlQueryLimit)

queue.drain = ()->
  logger.debug('all items have been processed')

module.exports =
  queue: queue
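A usage sketch (illustrative, not part of the commit): any Sequelize operation can be funnelled through this queue so that at most Settings.parallelSqlQueryLimit queries touch the SQLite file at once; the db require and Project model are assumed from the rest of the app.

dbQueue = require "./DbQueue"
db = require "./db"

# wrap the Sequelize call in a job; the outer callback fires once the job has run
job = (cb) ->
  db.Project.findAll()
  .then((projects) -> cb(null, projects))
  .error(cb)
dbQueue.queue.push job, (error, projects) ->
  console.error error if error?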

app/coffee/DockerLockManager.coffee (new file)

@@ -0,0 +1,56 @@
logger = require "logger-sharelatex"

LockState = {} # locks for docker container operations, by container name

module.exports = LockManager =

  MAX_LOCK_HOLD_TIME: 15000  # how long we can keep a lock
  MAX_LOCK_WAIT_TIME: 10000  # how long we wait for a lock
  LOCK_TEST_INTERVAL: 1000   # retry time

  tryLock: (key, callback = (err, gotLock) ->) ->
    existingLock = LockState[key]
    if existingLock? # the lock is already taken, check how old it is
      lockAge = Date.now() - existingLock.created
      if lockAge < LockManager.MAX_LOCK_HOLD_TIME
        return callback(null, false) # we didn't get the lock, bail out
      else
        logger.error {key: key, lock: existingLock, age:lockAge}, "taking old lock by force"
    # take the lock
    LockState[key] = lockValue = {created: Date.now()}
    callback(null, true, lockValue)

  getLock: (key, callback = (error, lockValue) ->) ->
    startTime = Date.now()
    do attempt = () ->
      LockManager.tryLock key, (error, gotLock, lockValue) ->
        return callback(error) if error?
        if gotLock
          callback(null, lockValue)
        else if Date.now() - startTime > LockManager.MAX_LOCK_WAIT_TIME
          e = new Error("Lock timeout")
          e.key = key
          return callback(e)
        else
          setTimeout attempt, LockManager.LOCK_TEST_INTERVAL

  releaseLock: (key, lockValue, callback = (error) ->) ->
    existingLock = LockState[key]
    if existingLock is lockValue # lockValue is an object, so we can test by reference
      delete LockState[key] # our lock, so we can free it
      callback()
    else if existingLock? # lock exists but doesn't match ours
      logger.error {key:key, lock: existingLock}, "tried to release lock taken by force"
      callback()
    else
      logger.error {key:key, lock: existingLock}, "tried to release lock that has gone"
      callback()

  runWithLock: (key, runner = ( (releaseLock = (error) ->) -> ), callback = ( (error) -> )) ->
    LockManager.getLock key, (error, lockValue) ->
      return callback(error) if error?
      runner (error1, args...) ->
        LockManager.releaseLock key, lockValue, (error2) ->
          error = error1 or error2
          return callback(error) if error?
          callback(null, args...)
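A minimal usage sketch (illustrative; startCompileContainer is a hypothetical helper): runWithLock serialises operations on a single container by name, and a stale lock is taken by force after MAX_LOCK_HOLD_TIME.

LockManager = require "./DockerLockManager"

LockManager.runWithLock "project-1234-abcdef0123", (releaseLock) ->
  # hold the lock while touching the container; pass errors/results to releaseLock
  startCompileContainer (err, output) ->  # hypothetical container operation
    releaseLock(err, output)
, (error, output) ->
  # lock has been released; a "Lock timeout" error lands here after MAX_LOCK_WAIT_TIME
  console.error error if error?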

app/coffee/DockerRunner.coffee (new file)

@@ -0,0 +1,358 @@
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
Docker = require("dockerode")
dockerode = new Docker()
crypto = require "crypto"
async = require "async"
LockManager = require "./DockerLockManager"
fs = require "fs"
Path = require 'path'
_ = require "underscore"

logger.info "using docker runner"

usingSiblingContainers = () ->
  Settings?.path?.sandboxedCompilesHostDir?

module.exports = DockerRunner =
  ERR_NOT_DIRECTORY: new Error("not a directory")
  ERR_TERMINATED: new Error("terminated")
  ERR_EXITED: new Error("exited")
  ERR_TIMED_OUT: new Error("container timed out")

  run: (project_id, command, directory, image, timeout, environment, callback = (error, output) ->) ->
    if usingSiblingContainers()
      _newPath = Settings.path.sandboxedCompilesHostDir
      logger.log {path: _newPath}, "altering bind path for sibling containers"
      # Server Pro, example:
      #   '/var/lib/sharelatex/data/compiles/<project-id>'
      # ... becomes ...
      #   '/opt/sharelatex_data/data/compiles/<project-id>'
      directory = Path.join(Settings.path.sandboxedCompilesHostDir, Path.basename(directory))

    volumes = {}
    volumes[directory] = "/compile"

    command = (arg.toString().replace?('$COMPILE_DIR', "/compile") for arg in command)
    if !image?
      image = Settings.clsi.docker.image

    if Settings.texliveImageNameOveride?
      img = image.split("/")
      image = "#{Settings.texliveImageNameOveride}/#{img[2]}"

    options = DockerRunner._getContainerOptions(command, image, volumes, timeout, environment)
    fingerprint = DockerRunner._fingerprintContainer(options)
    options.name = name = "project-#{project_id}-#{fingerprint}"

    # logOptions = _.clone(options)
    # logOptions?.HostConfig?.SecurityOpt = "secomp used, removed in logging"
    logger.log project_id: project_id, "running docker container"
    DockerRunner._runAndWaitForContainer options, volumes, timeout, (error, output) ->
      if error?.message?.match("HTTP code is 500")
        logger.log err: error, project_id: project_id, "error running container so destroying and retrying"
        DockerRunner.destroyContainer name, null, true, (error) ->
          return callback(error) if error?
          DockerRunner._runAndWaitForContainer options, volumes, timeout, callback
      else
        callback(error, output)

    return name # pass back the container name to allow it to be killed

  kill: (container_id, callback = (error) ->) ->
    logger.log container_id: container_id, "sending kill signal to container"
    container = dockerode.getContainer(container_id)
    container.kill (error) ->
      if error? and error?.message?.match?(/Cannot kill container .* is not running/)
        logger.warn err: error, container_id: container_id, "container not running, continuing"
        error = null
      if error?
        logger.error err: error, container_id: container_id, "error killing container"
        return callback(error)
      else
        callback()

  _runAndWaitForContainer: (options, volumes, timeout, _callback = (error, output) ->) ->
    callback = (args...) ->
      _callback(args...)
      # Only call the callback once
      _callback = () ->

    name = options.name

    streamEnded = false
    containerReturned = false
    output = {}

    callbackIfFinished = () ->
      if streamEnded and containerReturned
        callback(null, output)

    attachStreamHandler = (error, _output) ->
      return callback(error) if error?
      output = _output
      streamEnded = true
      callbackIfFinished()

    DockerRunner.startContainer options, volumes, attachStreamHandler, (error, containerId) ->
      return callback(error) if error?
      DockerRunner.waitForContainer name, timeout, (error, exitCode) ->
        return callback(error) if error?
        if exitCode is 137 # exit status from kill -9
          err = DockerRunner.ERR_TERMINATED
          err.terminated = true
          return callback(err)
        if exitCode is 1 # exit status from chktex
          err = DockerRunner.ERR_EXITED
          err.code = exitCode
          return callback(err)
        containerReturned = true
        options?.HostConfig?.SecurityOpt = null # small log line
        logger.log err:err, exitCode:exitCode, options:options, "docker container has exited"
        callbackIfFinished()

  _getContainerOptions: (command, image, volumes, timeout, environment) ->
    timeoutInSeconds = timeout / 1000

    dockerVolumes = {}
    for hostVol, dockerVol of volumes
      dockerVolumes[dockerVol] = {}
      if volumes[hostVol].slice(-3).indexOf(":r") == -1
        volumes[hostVol] = "#{dockerVol}:rw"

    # merge settings and environment parameter
    env = {}
    for src in [Settings.clsi.docker.env, environment or {}]
      env[key] = value for key, value of src
    # set the path based on the image year
    if m = image.match /:([0-9]+)\.[0-9]+/
      year = m[1]
    else
      year = "2014"
    env['PATH'] = "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/texlive/#{year}/bin/x86_64-linux/"

    options =
      "Cmd"             : command,
      "Image"           : image
      "Volumes"         : dockerVolumes
      "WorkingDir"      : "/compile"
      "NetworkDisabled" : true
      "Memory"          : 1024 * 1024 * 1024 * 1024 # 1 Gb
      "User"            : Settings.clsi.docker.user
      "Env"             : ("#{key}=#{value}" for key, value of env) # convert the environment hash to an array
      "HostConfig"      :
        "Binds": ("#{hostVol}:#{dockerVol}" for hostVol, dockerVol of volumes)
        "LogConfig": {"Type": "none", "Config": {}}
        "Ulimits": [{'Name': 'cpu', 'Soft': timeoutInSeconds+5, 'Hard': timeoutInSeconds+10}]
        "CapDrop": "ALL"
        "SecurityOpt": ["no-new-privileges"]

    if Settings.path?.synctexBinHostPath?
      options["HostConfig"]["Binds"].push("#{Settings.path.synctexBinHostPath}:/opt/synctex:ro")

    if Settings.clsi.docker.seccomp_profile?
      options.HostConfig.SecurityOpt.push "seccomp=#{Settings.clsi.docker.seccomp_profile}"

    return options

  _fingerprintContainer: (containerOptions) ->
    # Yay, Hashing!
    json = JSON.stringify(containerOptions)
    return crypto.createHash("md5").update(json).digest("hex")

  startContainer: (options, volumes, attachStreamHandler, callback) ->
    LockManager.runWithLock options.name, (releaseLock) ->
      # Check that volumes exist before starting the container.
      # When a container is started with volume pointing to a
      # non-existent directory then docker creates the directory but
      # with root ownership.
      DockerRunner._checkVolumes options, volumes, (err) ->
        return releaseLock(err) if err?
        DockerRunner._startContainer options, volumes, attachStreamHandler, releaseLock
    , callback

  # Check that volumes exist and are directories
  _checkVolumes: (options, volumes, callback = (error, containerName) ->) ->
    if usingSiblingContainers()
      # Server Pro, with sibling-containers active, skip checks
      return callback(null)

    checkVolume = (path, cb) ->
      fs.stat path, (err, stats) ->
        return cb(err) if err?
        return cb(DockerRunner.ERR_NOT_DIRECTORY) if not stats?.isDirectory()
        cb()
    jobs = []
    for vol of volumes
      do (vol) ->
        jobs.push (cb) -> checkVolume(vol, cb)
    async.series jobs, callback

  _startContainer: (options, volumes, attachStreamHandler, callback = ((error, output) ->)) ->
    callback = _.once(callback)
    name = options.name

    logger.log {container_name: name}, "starting container"
    container = dockerode.getContainer(name)

    createAndStartContainer = ->
      dockerode.createContainer options, (error, container) ->
        return callback(error) if error?
        startExistingContainer()

    startExistingContainer = ->
      DockerRunner.attachToContainer options.name, attachStreamHandler, (error)->
        return callback(error) if error?
        container.start (error) ->
          if error? and error?.statusCode != 304 # already running
            return callback(error)
          else
            callback()

    container.inspect (error, stats)->
      if error?.statusCode == 404
        createAndStartContainer()
      else if error?
        logger.err {container_name: name, error:error}, "unable to inspect container to start"
        return callback(error)
      else
        startExistingContainer()

  attachToContainer: (containerId, attachStreamHandler, attachStartCallback) ->
    container = dockerode.getContainer(containerId)
    container.attach {stdout: 1, stderr: 1, stream: 1}, (error, stream) ->
      if error?
        logger.error err: error, container_id: containerId, "error attaching to container"
        return attachStartCallback(error)
      else
        attachStartCallback()

      logger.log container_id: containerId, "attached to container"

      MAX_OUTPUT = 1024 * 1024 # limit output to 1MB
      createStringOutputStream = (name) ->
        return {
          data: ""
          overflowed: false
          write: (data) ->
            return if @overflowed
            if @data.length < MAX_OUTPUT
              @data += data
            else
              logger.error container_id: containerId, length: @data.length, maxLen: MAX_OUTPUT, "#{name} exceeds max size"
              @data += "(...truncated at #{MAX_OUTPUT} chars...)"
              @overflowed = true
          # kill container if too much output
          # docker.containers.kill(containerId, () ->)
        }

      stdout = createStringOutputStream "stdout"
      stderr = createStringOutputStream "stderr"

      container.modem.demuxStream(stream, stdout, stderr)

      stream.on "error", (err) ->
        logger.error err: err, container_id: containerId, "error reading from container stream"

      stream.on "end", () ->
        attachStreamHandler null, {stdout: stdout.data, stderr: stderr.data}

  waitForContainer: (containerId, timeout, _callback = (error, exitCode) ->) ->
    callback = (args...) ->
      _callback(args...)
      # Only call the callback once
      _callback = () ->

    container = dockerode.getContainer(containerId)

    timedOut = false
    timeoutId = setTimeout () ->
      timedOut = true
      logger.log container_id: containerId, "timeout reached, killing container"
      container.kill(() ->)
    , timeout

    logger.log container_id: containerId, "waiting for docker container"
    container.wait (error, res) ->
      if error?
        clearTimeout timeoutId
        logger.error err: error, container_id: containerId, "error waiting for container"
        return callback(error)
      if timedOut
        logger.log containerId: containerId, "docker container timed out"
        error = DockerRunner.ERR_TIMED_OUT
        error.timedout = true
        callback error
      else
        clearTimeout timeoutId
        logger.log container_id: containerId, exitCode: res.StatusCode, "docker container returned"
        callback null, res.StatusCode

  destroyContainer: (containerName, containerId, shouldForce, callback = (error) ->) ->
    # We want the containerName for the lock and, ideally, the
    # containerId to delete. There is a bug in the docker.io module
    # where if you delete by name and there is an error, it throws an
    # async exception, but if you delete by id it just does a normal
    # error callback. We fall back to deleting by name if no id is
    # supplied.
    LockManager.runWithLock containerName, (releaseLock) ->
      DockerRunner._destroyContainer containerId or containerName, shouldForce, releaseLock
    , callback

  _destroyContainer: (containerId, shouldForce, callback = (error) ->) ->
    logger.log container_id: containerId, "destroying docker container"
    container = dockerode.getContainer(containerId)
    container.remove {force: shouldForce == true}, (error) ->
      if error? and error?.statusCode == 404
        logger.warn err: error, container_id: containerId, "container not found, continuing"
        error = null
      if error?
        logger.error err: error, container_id: containerId, "error destroying container"
      else
        logger.log container_id: containerId, "destroyed container"
      callback(error)

  # handle expiry of docker containers
  MAX_CONTAINER_AGE: Settings.clsi.docker.maxContainerAge or oneHour = 60 * 60 * 1000

  examineOldContainer: (container, callback = (error, name, id, ttl)->) ->
    name = container.Name or container.Names?[0]
    created = container.Created * 1000 # creation time is returned in seconds
    now = Date.now()
    age = now - created
    maxAge = DockerRunner.MAX_CONTAINER_AGE
    ttl = maxAge - age
    logger.log {containerName: name, created: created, now: now, age: age, maxAge: maxAge, ttl: ttl}, "checking whether to destroy container"
    callback(null, name, container.Id, ttl)

  destroyOldContainers: (callback = (error) ->) ->
    dockerode.listContainers all: true, (error, containers) ->
      return callback(error) if error?
      jobs = []
      for container in containers or []
        do (container) ->
          DockerRunner.examineOldContainer container, (err, name, id, ttl) ->
            if name.slice(0, 9) == '/project-' && ttl <= 0
              jobs.push (cb) ->
                DockerRunner.destroyContainer name, id, false, () -> cb()
      # Ignore errors because some containers get stuck but
      # will be destroyed next time
      async.series jobs, callback

  startContainerMonitor: () ->
    logger.log {maxAge: DockerRunner.MAX_CONTAINER_AGE}, "starting container expiry"
    # randomise the start time
    randomDelay = Math.floor(Math.random() * 5 * 60 * 1000)
    setTimeout () ->
      setInterval () ->
        DockerRunner.destroyOldContainers()
      , oneHour = 60 * 60 * 1000
    , randomDelay

DockerRunner.startContainerMonitor()
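For orientation, a sketch of a call into this runner (values illustrative; $COMPILE_DIR is rewritten to /compile inside the container, and the returned name can later be handed to DockerRunner.kill):

DockerRunner = require "./DockerRunner"

command = ["latexmk", "-cd", "-pdf", "$COMPILE_DIR/main.tex"]
name = DockerRunner.run "1234", command, "/app/compiles/1234", "quay.io/sharelatex/texlive-full:2017.1", 60 * 1000, {}, (error, output) ->
  return console.error error if error?
  console.log output.stdout, output.stderr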

app/coffee/LatexRunner.coffee

@@ -2,42 +2,36 @@ Path = require "path"
Settings = require "settings-sharelatex"
logger = require "logger-sharelatex"
Metrics = require "./Metrics"
-CommandRunner = require(Settings.clsi?.commandRunner or "./CommandRunner")
+CommandRunner = require "./CommandRunner"

ProcessTable = {} # table of currently running jobs (pids or docker container names)

module.exports = LatexRunner =
  runLatex: (project_id, options, callback = (error) ->) ->
-   {directory, mainFile, compiler, timeout, image, environment} = options
+   {directory, mainFile, compiler, timeout, image, environment, flags} = options
    compiler ||= "pdflatex"
    timeout ||= 60000 # milliseconds

-   logger.log directory: directory, compiler: compiler, timeout: timeout, mainFile: mainFile, environment: environment, "starting compile"
+   logger.log directory: directory, compiler: compiler, timeout: timeout, mainFile: mainFile, environment: environment, flags:flags, "starting compile"

    # We want to run latexmk on the tex file which we will automatically
    # generate from the Rtex/Rmd/md file.
-   mainFile = mainFile.replace(/\.(Rtex|md|Rmd)$/, ".md")
+   mainFile = mainFile.replace(/\.(Rtex|md|Rmd)$/, ".tex")

    if compiler == "pdflatex"
-     command = LatexRunner._pdflatexCommand mainFile
+     command = LatexRunner._pdflatexCommand mainFile, flags
    else if compiler == "latex"
-     command = LatexRunner._latexCommand mainFile
+     command = LatexRunner._latexCommand mainFile, flags
    else if compiler == "xelatex"
-     command = LatexRunner._xelatexCommand mainFile
+     command = LatexRunner._xelatexCommand mainFile, flags
    else if compiler == "lualatex"
-     command = LatexRunner._lualatexCommand mainFile
+     command = LatexRunner._lualatexCommand mainFile, flags
    else
      return callback new Error("unknown compiler: #{compiler}")

    if Settings.clsi?.strace
      command = ["strace", "-o", "strace", "-ff"].concat(command)

-   # ignore the above and make a pandoc command
-   console.log(mainFile)
-   console.log(image)
-   image = "ivotron/pandoc"
-   command = ["-o", "$COMPILE_DIR/output.html", "/compile/" + mainFile]

    id = "#{project_id}" # record running project under this id
    ProcessTable[id] = CommandRunner.run project_id, command, directory, image, timeout, environment, (error, output) ->
@@ -69,31 +63,32 @@ module.exports = LatexRunner =
      else
        CommandRunner.kill ProcessTable[id], callback

- _latexmkBaseCommand: (Settings?.clsi?.latexmkCommandPrefix || []).concat([
-   "latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR",
-   "-synctex=1","-interaction=batchmode"
- ])
+ _latexmkBaseCommand: (flags) ->
+   args = ["latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR", "-synctex=1","-interaction=batchmode"]
+   if flags
+     args = args.concat(flags)
+   (Settings?.clsi?.latexmkCommandPrefix || []).concat(args)

- _pdflatexCommand: (mainFile) ->
-   LatexRunner._latexmkBaseCommand.concat [
+ _pdflatexCommand: (mainFile, flags) ->
+   LatexRunner._latexmkBaseCommand(flags).concat [
      "-pdf",
      Path.join("$COMPILE_DIR", mainFile)
    ]

- _latexCommand: (mainFile) ->
-   LatexRunner._latexmkBaseCommand.concat [
+ _latexCommand: (mainFile, flags) ->
+   LatexRunner._latexmkBaseCommand(flags).concat [
      "-pdfdvi",
      Path.join("$COMPILE_DIR", mainFile)
    ]

- _xelatexCommand: (mainFile) ->
-   LatexRunner._latexmkBaseCommand.concat [
+ _xelatexCommand: (mainFile, flags) ->
+   LatexRunner._latexmkBaseCommand(flags).concat [
      "-xelatex",
      Path.join("$COMPILE_DIR", mainFile)
    ]

- _lualatexCommand: (mainFile) ->
-   LatexRunner._latexmkBaseCommand.concat [
+ _lualatexCommand: (mainFile, flags) ->
+   LatexRunner._latexmkBaseCommand(flags).concat [
      "-lualatex",
      Path.join("$COMPILE_DIR", mainFile)
    ]
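The effect of the new flags parameter, sketched (assuming no latexmkCommandPrefix is configured): the extra arguments land between the base latexmk switches and the output-format switch.

# illustrative: a pdflatex compile with -file-line-error from the request JSON
LatexRunner._pdflatexCommand "main.tex", ["-file-line-error"]
# => ["latexmk", "-cd", "-f", "-jobname=output", "-auxdir=$COMPILE_DIR", "-outdir=$COMPILE_DIR",
#     "-synctex=1", "-interaction=batchmode", "-file-line-error", "-pdf", "$COMPILE_DIR/main.tex"]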

app/coffee/LocalCommandRunner.coffee (new file)

@@ -0,0 +1,48 @@
spawn = require("child_process").spawn
logger = require "logger-sharelatex"

logger.info "using standard command runner"

module.exports = CommandRunner =
  run: (project_id, command, directory, image, timeout, environment, callback = (error) ->) ->
    command = (arg.toString().replace('$COMPILE_DIR', directory) for arg in command)
    logger.log project_id: project_id, command: command, directory: directory, "running command"
    logger.warn "timeouts and sandboxing are not enabled with CommandRunner"

    # merge environment settings
    env = {}
    env[key] = value for key, value of process.env
    env[key] = value for key, value of environment

    # run command as detached process so it has its own process group (which can be killed if needed)
    proc = spawn command[0], command.slice(1), cwd: directory, env: env

    stdout = ""
    proc.stdout.on "data", (data)->
      stdout += data

    proc.on "error", (err)->
      logger.err err:err, project_id:project_id, command: command, directory: directory, "error running command"
      callback(err)

    proc.on "close", (code, signal) ->
      logger.info code:code, signal:signal, project_id:project_id, "command exited"
      if signal is 'SIGTERM' # signal from kill method below
        err = new Error("terminated")
        err.terminated = true
        return callback(err)
      else if code is 1 # exit status from chktex
        err = new Error("exited")
        err.code = code
        return callback(err)
      else
        callback(null, {"stdout": stdout})

    return proc.pid # return process id to allow job to be killed if necessary

  kill: (pid, callback = (error) ->) ->
    try
      process.kill -pid # kill all processes in group
    catch err
      return callback(err)
    callback()
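A usage sketch (illustrative; the require path assumes CommandRunner resolves to this runner when the docker runner is not enabled): the local runner substitutes $COMPILE_DIR itself and, as the warning above says, applies no timeout or sandboxing; the image argument is accepted only for interface compatibility.

CommandRunner = require "./CommandRunner"

pid = CommandRunner.run "1234", ["texcount", "-inc", "$COMPILE_DIR/main.tex"], "/tmp/compile-1234", null, 10000, {}, (error, output) ->
  return console.error error if error?
  console.log output.stdout
# later, CommandRunner.kill(pid) signals the whole process group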

app/coffee/LockManager.coffee

@@ -2,7 +2,8 @@ Settings = require('settings-sharelatex')
logger = require "logger-sharelatex"
Lockfile = require('lockfile') # from https://github.com/npm/lockfile
Errors = require "./Errors"
+fs = require("fs")
+Path = require("path")

module.exports = LockManager =
  LOCK_TEST_INTERVAL: 1000 # 1s between each test of the lock
  MAX_LOCK_WAIT_TIME: 15000 # 15s maximum time to spend trying to get the lock
@@ -14,10 +15,17 @@ module.exports = LockManager =
      pollPeriod: @LOCK_TEST_INTERVAL
      stale: @LOCK_STALE
    Lockfile.lock path, lockOpts, (error) ->
-     return callback new Errors.AlreadyCompilingError("compile in progress") if error?.code is 'EEXIST'
-     return callback(error) if error?
-     runner (error1, args...) ->
-       Lockfile.unlock path, (error2) ->
-         error = error1 or error2
-         return callback(error) if error?
-         callback(null, args...)
+     if error?.code is 'EEXIST'
+       return callback new Errors.AlreadyCompilingError("compile in progress")
+     else if error?
+       fs.lstat path, (statLockErr, statLock)->
+         fs.lstat Path.dirname(path), (statDirErr, statDir)->
+           fs.readdir Path.dirname(path), (readdirErr, readdirDir)->
+             logger.err error:error, path:path, statLock:statLock, statLockErr:statLockErr, statDir:statDir, statDirErr: statDirErr, readdirErr:readdirErr, readdirDir:readdirDir, "unable to get lock"
+             return callback(error)
+     else
+       runner (error1, args...) ->
+         Lockfile.unlock path, (error2) ->
+           error = error1 or error2
+           return callback(error) if error?
+           callback(null, args...)

app/coffee/OutputFileFinder.coffee

@@ -10,8 +10,6 @@ module.exports = OutputFileFinder =
    for resource in resources
      incomingResources[resource.path] = true

-   logger.log directory: directory, "getting output files"
    OutputFileFinder._getAllFiles directory, (error, allFiles = []) ->
      if error?
        logger.err err:error, "error finding all output files"

app/coffee/ProjectPersistenceManager.coffee

@@ -1,6 +1,7 @@
UrlCache = require "./UrlCache"
CompileManager = require "./CompileManager"
db = require "./db"
+dbQueue = require "./DbQueue"
async = require "async"
logger = require "logger-sharelatex"
oneDay = 24 * 60 * 60 * 1000
@@ -11,14 +12,17 @@ module.exports = ProjectPersistenceManager =
  EXPIRY_TIMEOUT: Settings.project_cache_length_ms || oneDay * 2.5

  markProjectAsJustAccessed: (project_id, callback = (error) ->) ->
-   db.Project.findOrCreate(where: {project_id: project_id})
-   .spread(
-     (project, created) ->
-       project.updateAttributes(lastAccessed: new Date())
-       .then(() -> callback())
-       .error callback
-   )
-   .error callback
+   job = (cb)->
+     db.Project.findOrCreate(where: {project_id: project_id})
+     .spread(
+       (project, created) ->
+         project.updateAttributes(lastAccessed: new Date())
+         .then(() -> cb())
+         .error cb
+     )
+     .error cb
+   dbQueue.queue.push(job, callback)

  clearExpiredProjects: (callback = (error) ->) ->
    ProjectPersistenceManager._findExpiredProjectIds (error, project_ids) ->
@@ -47,20 +51,34 @@ module.exports = ProjectPersistenceManager =
  clearProjectFromCache: (project_id, callback = (error) ->) ->
    logger.log project_id: project_id, "clearing project from cache"
    UrlCache.clearProject project_id, (error) ->
-     return callback(error) if error?
+     if error?
+       logger.err error:error, project_id: project_id, "error clearing project from cache"
+       return callback(error)
      ProjectPersistenceManager._clearProjectFromDatabase project_id, (error) ->
-       return callback(error) if error?
-       callback()
+       if error?
+         logger.err error:error, project_id:project_id, "error clearing project from database"
+       callback(error)

  _clearProjectFromDatabase: (project_id, callback = (error) ->) ->
-   db.Project.destroy(where: {project_id: project_id})
-   .then(() -> callback())
-   .error callback
+   logger.log project_id:project_id, "clearing project from database"
+   job = (cb)->
+     db.Project.destroy(where: {project_id: project_id})
+     .then(() -> cb())
+     .error cb
+   dbQueue.queue.push(job, callback)

  _findExpiredProjectIds: (callback = (error, project_ids) ->) ->
-   db.Project.findAll(where: ["lastAccessed < ?", new Date(Date.now() - ProjectPersistenceManager.EXPIRY_TIMEOUT)])
-   .then((projects) ->
-     callback null, projects.map((project) -> project.project_id)
-   ).error callback
+   job = (cb)->
+     keepProjectsFrom = new Date(Date.now() - ProjectPersistenceManager.EXPIRY_TIMEOUT)
+     q = {}
+     q[db.op.lt] = keepProjectsFrom
+     db.Project.findAll(where:{lastAccessed:q})
+     .then((projects) ->
+       cb null, projects.map((project) -> project.project_id)
+     ).error cb
+   dbQueue.queue.push(job, callback)

+logger.log {EXPIRY_TIMEOUT: ProjectPersistenceManager.EXPIRY_TIMEOUT}, "project assets kept timeout"
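Spelled out (sketch): Sequelize 4 replaced string operators with symbols, exposed here via db.op (set to Sequelize.Op in db.coffee below); CoffeeScript 1.x has no computed object keys, hence the imperative build of the where-clause.

# q becomes { [Op.lt]: keepProjectsFrom }, i.e. "lastAccessed < keepProjectsFrom"
q = {}
q[db.op.lt] = keepProjectsFrom
db.Project.findAll(where: {lastAccessed: q})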

app/coffee/RequestParser.coffee

@@ -1,6 +1,8 @@
+settings = require("settings-sharelatex")

module.exports = RequestParser =
  VALID_COMPILERS: ["pdflatex", "latex", "xelatex", "lualatex"]
- MAX_TIMEOUT: 300
+ MAX_TIMEOUT: 600

  parse: (body, callback = (error, data) ->) ->
    response = {}
@@ -10,7 +12,7 @@ module.exports = RequestParser =
    compile = body.compile
    compile.options ||= {}

    try
      response.compiler = @_parseAttribute "compiler",
        compile.options.compiler,
@@ -31,6 +33,10 @@ module.exports = RequestParser =
      response.check = @_parseAttribute "check",
        compile.options.check,
        type: "string"
+     response.flags = @_parseAttribute "flags",
+       compile.options.flags,
+       default: [],
+       type: "object"

      # The syncType specifies whether the request contains all
      # resources (full) or only those resources to be updated
@@ -66,7 +72,7 @@ module.exports = RequestParser =
      originalRootResourcePath = rootResourcePath
      sanitizedRootResourcePath = RequestParser._sanitizePath(rootResourcePath)
      response.rootResourcePath = RequestParser._checkPath(sanitizedRootResourcePath)

      for resource in response.resources
        if resource.path == originalRootResourcePath
          resource.path = sanitizedRootResourcePath
@@ -85,7 +91,7 @@ module.exports = RequestParser =
throw "resource modified date could not be understood: #{resource.modified}" throw "resource modified date could not be understood: #{resource.modified}"
if !resource.url? and !resource.content? if !resource.url? and !resource.content?
throw "all resources should have either a url or content attribute" throw "all resources should have either a url or content attribute"
if resource.content? and typeof resource.content != "string" if resource.content? and typeof resource.content != "string"
throw "content attribute should be a string" throw "content attribute should be a string"
if resource.url? and typeof resource.url != "string" if resource.url? and typeof resource.url != "string"
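A request body exercising the new option might look like this (illustrative sketch of the compile request JSON, written as a CoffeeScript literal):

body =
  compile:
    options:
      compiler: "pdflatex"
      timeout: 600                  # seconds; now capped by MAX_TIMEOUT = 600
      flags: ["-file-line-error"]   # defaults to [] when absent
    rootResourcePath: "main.tex"
    resources: [
      path: "main.tex"
      content: "\\documentclass{article}\\begin{document}hello\\end{document}"
    ]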

app/coffee/ResourceWriter.coffee

@@ -78,8 +78,16 @@ module.exports = ResourceWriter =
      should_delete = true
    if path.match(/^output\./) or path.match(/\.aux$/) or path.match(/^cache\//) # knitr cache
      should_delete = false
-   if path.match(/^output-.*/) # Tikz cached figures
+   if path.match(/^output-.*/) # Tikz cached figures (default case)
      should_delete = false
+   if path.match(/\.(pdf|dpth|md5)$/) # Tikz cached figures (by extension)
+     should_delete = false
+   if path.match(/\.(pygtex|pygstyle)$/) or path.match(/(^|\/)_minted-[^\/]+\//) # minted files/directory
+     should_delete = false
+   if path.match(/\.md\.tex$/) or path.match(/(^|\/)_markdown_[^\/]+\//) # markdown files/directory
+     should_delete = false
+   if path.match(/-eps-converted-to\.pdf$/) # Epstopdf generated files
+     should_delete = false
    if path == "output.pdf" or path == "output.dvi" or path == "output.log" or path == "output.xdv"
      should_delete = true
    if path == "output.tex" # created by TikzManager if present in output files
@@ -120,7 +128,11 @@ module.exports = ResourceWriter =
          logger.err err:err, project_id:project_id, path:path, resource_url:resource.url, modified:resource.modified, "error downloading file for resources"
          callback() # try and continue compiling even if http resource can not be downloaded at this time
        else
+         process = require("process")
          fs.writeFile path, resource.content, callback
+         try
+           result = fs.lstatSync(path)
+         catch e

  checkPath: (basePath, resourcePath, callback) ->
    path = Path.normalize(Path.join(basePath, resourcePath))

app/coffee/TikzManager.coffee

@@ -4,32 +4,34 @@ ResourceWriter = require "./ResourceWriter"
SafeReader = require "./SafeReader"
logger = require "logger-sharelatex"

-# for \tikzexternalize to work the main file needs to match the
+# for \tikzexternalize or pstool to work the main file needs to match the
# jobname. Since we set the -jobname to output, we have to create a
# copy of the main file as 'output.tex'.

module.exports = TikzManager =

- checkMainFile: (compileDir, mainFile, resources, callback = (error, usesTikzExternalize) ->) ->
+ checkMainFile: (compileDir, mainFile, resources, callback = (error, needsMainFile) ->) ->
    # if there's already an output.tex file, we don't want to touch it
    for resource in resources
      if resource.path is "output.tex"
        logger.log compileDir: compileDir, mainFile: mainFile, "output.tex already in resources"
        return callback(null, false)
-   # if there's no output.tex, see if we are using tikz/pgf in the main file
+   # if there's no output.tex, see if we are using tikz/pgf or pstool in the main file
    ResourceWriter.checkPath compileDir, mainFile, (error, path) ->
      return callback(error) if error?
      SafeReader.readFile path, 65536, "utf8", (error, content) ->
        return callback(error) if error?
        usesTikzExternalize = content?.indexOf("\\tikzexternalize") >= 0
+       usesPsTool = content?.indexOf("{pstool}") >= 0
-       logger.log compileDir: compileDir, mainFile: mainFile, usesTikzExternalize:usesTikzExternalize, "checked for tikzexternalize"
-       callback null, usesTikzExternalize
+       logger.log compileDir: compileDir, mainFile: mainFile, usesTikzExternalize:usesTikzExternalize, usesPsTool: usesPsTool, "checked for packages needing main file as output.tex"
+       needsMainFile = (usesTikzExternalize || usesPsTool)
+       callback null, needsMainFile

  injectOutputFile: (compileDir, mainFile, callback = (error) ->) ->
    ResourceWriter.checkPath compileDir, mainFile, (error, path) ->
      return callback(error) if error?
      fs.readFile path, "utf8", (error, content) ->
        return callback(error) if error?
-       logger.log compileDir: compileDir, mainFile: mainFile, "copied file to output.tex for tikz"
+       logger.log compileDir: compileDir, mainFile: mainFile, "copied file to output.tex as project uses packages which require it"
        # use wx flag to ensure that output file does not already exist
        fs.writeFile Path.join(compileDir, "output.tex"), content, {flag:'wx'}, callback
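A sketch of the trigger (illustrative paths): a main file whose first 64KB contain \tikzexternalize or a {pstool} package reference now reports needsMainFile and gets copied to output.tex.

TikzManager = require "./TikzManager"

# assumes /tmp/compile-1234/main.tex exists on disk and no resource is already named output.tex
TikzManager.checkMainFile "/tmp/compile-1234", "main.tex", [], (error, needsMainFile) ->
  if needsMainFile
    TikzManager.injectOutputFile "/tmp/compile-1234", "main.tex", (error) ->
      console.error error if error?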

app/coffee/UrlCache.coffee

@@ -1,4 +1,5 @@
db = require("./db") db = require("./db")
dbQueue = require "./DbQueue"
UrlFetcher = require("./UrlFetcher") UrlFetcher = require("./UrlFetcher")
Settings = require("settings-sharelatex") Settings = require("settings-sharelatex")
crypto = require("crypto") crypto = require("crypto")
@@ -51,7 +52,6 @@ module.exports = UrlCache =
  _doesUrlNeedDownloading: (project_id, url, lastModified, callback = (error, needsDownloading) ->) ->
    if !lastModified?
      return callback null, true

    UrlCache._findUrlDetails project_id, url, (error, urlDetails) ->
      return callback(error) if error?
      if !urlDetails? or !urlDetails.lastModified? or urlDetails.lastModified.getTime() < lastModified.getTime()
@@ -94,32 +94,41 @@ module.exports = UrlCache =
      return callback()

  _findUrlDetails: (project_id, url, callback = (error, urlDetails) ->) ->
-   db.UrlCache.find(where: { url: url, project_id: project_id })
-   .then((urlDetails) -> callback null, urlDetails)
-   .error callback
+   job = (cb)->
+     db.UrlCache.find(where: { url: url, project_id: project_id })
+     .then((urlDetails) -> cb null, urlDetails)
+     .error cb
+   dbQueue.queue.push job, callback

  _updateOrCreateUrlDetails: (project_id, url, lastModified, callback = (error) ->) ->
-   db.UrlCache.findOrCreate(where: {url: url, project_id: project_id})
-   .spread(
-     (urlDetails, created) ->
-       urlDetails.updateAttributes(lastModified: lastModified)
-       .then(() -> callback())
-       .error(callback)
-   )
-   .error callback
+   job = (cb)->
+     db.UrlCache.findOrCreate(where: {url: url, project_id: project_id})
+     .spread(
+       (urlDetails, created) ->
+         urlDetails.updateAttributes(lastModified: lastModified)
+         .then(() -> cb())
+         .error(cb)
+     )
+     .error cb
+   dbQueue.queue.push(job, callback)

  _clearUrlDetails: (project_id, url, callback = (error) ->) ->
-   db.UrlCache.destroy(where: {url: url, project_id: project_id})
-   .then(() -> callback null)
-   .error callback
+   job = (cb)->
+     db.UrlCache.destroy(where: {url: url, project_id: project_id})
+     .then(() -> cb null)
+     .error cb
+   dbQueue.queue.push(job, callback)

  _findAllUrlsInProject: (project_id, callback = (error, urls) ->) ->
-   db.UrlCache.findAll(where: { project_id: project_id })
-   .then(
-     (urlEntries) ->
-       callback null, urlEntries.map((entry) -> entry.url)
-   )
-   .error callback
+   job = (cb)->
+     db.UrlCache.findAll(where: { project_id: project_id })
+     .then(
+       (urlEntries) ->
+         cb null, urlEntries.map((entry) -> entry.url)
+     )
+     .error cb
+   dbQueue.queue.push(job, callback)

app/coffee/UrlFetcher.coffee

@@ -1,6 +1,8 @@
request = require("request").defaults(jar: false)
fs = require("fs")
logger = require "logger-sharelatex"
+settings = require("settings-sharelatex")
+URL = require('url');

oneMinute = 60 * 1000
@@ -11,6 +13,9 @@ module.exports = UrlFetcher =
      _callback(error)
      _callback = () ->

+   if settings.filestoreDomainOveride?
+     p = URL.parse(url).path
+     url = "#{settings.filestoreDomainOveride}#{p}"
    timeoutHandler = setTimeout () ->
      timeoutHandler = null
      logger.error url:url, filePath: filePath, "Timed out downloading file to cache"
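The effect of the override, sketched (hostnames are illustrative): only the path of the original URL survives; the scheme and host are replaced wholesale.

# with settings.filestoreDomainOveride = "http://filestore:3009"
url = "http://localhost:3009/project/abc123/file/def456"
p = URL.parse(url).path                        # => "/project/abc123/file/def456"
url = "#{settings.filestoreDomainOveride}#{p}" # => "http://filestore:3009/project/abc123/file/def456"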

app/coffee/db.coffee

@@ -1,9 +1,12 @@
Sequelize = require("sequelize")
Settings = require("settings-sharelatex")
_ = require("underscore")
+logger = require "logger-sharelatex"

options = _.extend {logging:false}, Settings.mysql.clsi

+logger.log dbPath:Settings.mysql.clsi.storage, "connecting to db"

sequelize = new Sequelize(
  Settings.mysql.clsi.database,
  Settings.mysql.clsi.username,
@@ -11,6 +14,12 @@ sequelize = new Sequelize(
  options
)

+if Settings.mysql.clsi.dialect == "sqlite"
+  logger.log "running PRAGMA journal_mode=WAL;"
+  sequelize.query("PRAGMA journal_mode=WAL;")
+  sequelize.query("PRAGMA synchronous=OFF;")
+  sequelize.query("PRAGMA read_uncommitted = true;")

module.exports =
  UrlCache: sequelize.define("UrlCache", {
    url: Sequelize.STRING
@@ -32,5 +41,15 @@ module.exports =
    ]
  })

- sync: () -> sequelize.sync()
+ op: Sequelize.Op
+ sync: () ->
+   logger.log dbPath:Settings.mysql.clsi.storage, "syncing db schema"
+   sequelize.sync()
+   .then(->
+     logger.log "db sync complete"
+   ).catch((err)->
+     console.log err, "error syncing"
+   )
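A quick check that the WAL pragma took effect (illustrative sketch; the raw-query result shape varies by Sequelize version and dialect):

sequelize.query("PRAGMA journal_mode;").then (results) ->
  console.log results # expect 'wal' once the sqlite dialect branch has run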

bin/acceptance_test (new file)

@@ -0,0 +1,4 @@
#!/bin/bash
set -e;
MOCHA="node_modules/.bin/mocha --recursive --reporter spec --timeout 15000"
$MOCHA "$@"

bin/synctex (new executable; binary file not shown)

buildscript.txt (new file)

@@ -0,0 +1,9 @@
clsi
--language=coffeescript
--node-version=10.15.0
--acceptance-creds=None
--dependencies=mongo,redis
--docker-repos=gcr.io/overleaf-ops
--env-pass-through=TEXLIVE_IMAGE
--build-target=docker
--script-version=1.1.22

cloudbuild.yaml (new file)

@@ -0,0 +1,39 @@
steps:
  - id: texlive
    name: 'gcr.io/overleaf-ops/texlive-full:2017.1'
  - id: build
    name: 'gcr.io/overleaf-ops/cloud-builder'
    args:
      - 'build'
    env:
      - 'BUILD_NUMBER=$SHORT_SHA'
      - 'BRANCH_NAME=$BRANCH_NAME'
    waitFor: ['-']
  - id: test_unit
    name: 'gcr.io/overleaf-ops/cloud-builder'
    args:
      - 'test_unit'
    env:
      - 'DOCKER_COMPOSE_FLAGS=-f docker-compose.ci.yml'
      - 'BUILD_NUMBER=$SHORT_SHA'
      - 'BRANCH_NAME=$BRANCH_NAME'
    waitFor:
      - build
  - id: test_acceptance
    name: 'gcr.io/overleaf-ops/cloud-builder'
    args:
      - 'test_acceptance'
    env:
      - 'DOCKER_COMPOSE_FLAGS=-f docker-compose.ci.yml'
      - 'BUILD_NUMBER=$SHORT_SHA'
      - 'BRANCH_NAME=$BRANCH_NAME'
      - 'TEXLIVE_IMAGE=gcr.io/overleaf-ops/texlive-full:2017.1'
    waitFor:
      - build
      - texlive

images:
  - 'gcr.io/$PROJECT_ID/clsi:${BRANCH_NAME}-${SHORT_SHA}'

timeout: 1800s

options:
  diskSizeGb: 200
  machineType: 'N1_HIGHCPU_8'

config/settings.defaults.coffee

@@ -7,10 +7,16 @@ module.exports =
    clsi:
      database: "clsi"
      username: "clsi"
+     password: null
      dialect: "sqlite"
-     storage: Path.resolve(__dirname + "/../db.sqlite")
+     storage: process.env["SQLITE_PATH"] or Path.resolve(__dirname + "/../db.sqlite")
+     pool:
+       max: 1
+       min: 1
+     retry:
+       max: 10

+ compileSizeLimit: process.env["COMPILE_SIZE_LIMIT"] or "7mb"

  path:
    compilesDir: Path.resolve(__dirname + "/../compiles")
    clsiCacheDir: Path.resolve(__dirname + "/../cache")
@@ -20,19 +26,29 @@ module.exports =
    clsi:
      port: 3013
      host: process.env["LISTEN_ADDRESS"] or "localhost"
+     load_balancer_agent:
+       report_load: true
+       load_port: 3048
+       local_port: 3049

  apis:
    clsi:
-     url: "http://localhost:3013"
+     url: "http://#{process.env['CLSI_HOST'] or 'localhost'}:3013"

- smokeTest: false
- project_cache_length_ms: 1000 * 60 * 60 * 24
- parallelFileDownloads:1
+ smokeTest: process.env["SMOKE_TEST"] or false
+ project_cache_length_ms: 1000 * 60 * 60 * 24
+ parallelFileDownloads: process.env["FILESTORE_PARALLEL_FILE_DOWNLOADS"] or 1
+ parallelSqlQueryLimit: process.env["FILESTORE_PARALLEL_SQL_QUERY_LIMIT"] or 1
+ filestoreDomainOveride: process.env["FILESTORE_DOMAIN_OVERRIDE"]
+ texliveImageNameOveride: process.env["TEX_LIVE_IMAGE_NAME_OVERRIDE"]
+ sentry:
+   dsn: process.env['SENTRY_DSN']

-if process.env["COMMAND_RUNNER"]
+if process.env["DOCKER_RUNNER"]
  module.exports.clsi =
-   commandRunner: process.env["COMMAND_RUNNER"]
+   dockerRunner: process.env["DOCKER_RUNNER"] == "true"
    docker:
      image: process.env["TEXLIVE_IMAGE"] or "quay.io/sharelatex/texlive-full:2017.1"
      env:
@@ -41,4 +57,15 @@ if process.env["COMMAND_RUNNER"]
      user: process.env["TEXLIVE_IMAGE_USER"] or "tex"
    expireProjectAfterIdleMs: 24 * 60 * 60 * 1000
    checkProjectsIntervalMs: 10 * 60 * 1000

+ try
+   seccomp_profile_path = Path.resolve(__dirname + "/../seccomp/clsi-profile.json")
+   module.exports.clsi.docker.seccomp_profile = JSON.stringify(JSON.parse(require("fs").readFileSync(seccomp_profile_path)))
+ catch error
+   console.log error, "could not load seccom profile from #{seccomp_profile_path}"

+ module.exports.path.synctexBaseDir = -> "/compile"
  module.exports.path.sandboxedCompilesHostDir = process.env["COMPILES_HOST_DIR"]
+ module.exports.path.synctexBinHostPath = process.env["SYNCTEX_BIN_HOST_PATH"]

debug (new executable)

@@ -0,0 +1,5 @@
#!/bin/bash
echo "hello world"
sleep 3
echo "awake"
/opt/synctex pdf /compile/output.pdf 1 100 200

docker-compose-config.yml (new file)

@@ -0,0 +1,32 @@
version: "2"
services:
dev:
environment:
TEXLIVE_IMAGE: quay.io/sharelatex/texlive-full:2017.1
TEXLIVE_IMAGE_USER: "tex"
SHARELATEX_CONFIG: /app/config/settings.defaults.coffee
DOCKER_RUNNER: "true"
COMPILES_HOST_DIR: $PWD/compiles
SYNCTEX_BIN_HOST_PATH: $PWD/bin/synctex
volumes:
- /var/run/docker.sock:/var/run/docker.sock
- ./compiles:/app/compiles
- ./cache:/app/cache
- ./bin/synctex:/app/bin/synctex
ci:
environment:
TEXLIVE_IMAGE: quay.io/sharelatex/texlive-full:2017.1
TEXLIVE_IMAGE_USER: "tex"
SHARELATEX_CONFIG: /app/config/settings.defaults.coffee
DOCKER_RUNNER: "true"
COMPILES_HOST_DIR: $PWD/compiles
SYNCTEX_BIN_HOST_PATH: $PWD/bin/synctex
SQLITE_PATH: /app/compiles/db.sqlite
volumes:
- /var/run/docker.sock:/var/run/docker.sock:rw
- ./compiles:/app/compiles
- ./cache:/app/cache
- ./bin/synctex:/app/bin/synctex

docker-compose.ci.yml (new file)

@@ -0,0 +1,49 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/sharelatex/sharelatex-dev-environment
# Version: 1.1.22

version: "2"

services:
  test_unit:
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    command: npm run test:unit:_run
    environment:
      NODE_ENV: test

  test_acceptance:
    build: .
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    extends:
      file: docker-compose-config.yml
      service: ci
    environment:
      ELASTIC_SEARCH_DSN: es:9200
      REDIS_HOST: redis
      MONGO_HOST: mongo
      POSTGRES_HOST: postgres
      MOCHA_GREP: ${MOCHA_GREP}
      NODE_ENV: test
      TEXLIVE_IMAGE:
    depends_on:
      - mongo
      - redis
    command: npm run test:acceptance:_run

  tar:
    build: .
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    volumes:
      - ./:/tmp/build/
    command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
    user: root

  redis:
    image: redis

  mongo:
    image: mongo:3.4

docker-compose.yml (new file)

@@ -0,0 +1,56 @@
# This file was auto-generated, do not edit it directly.
# Instead run bin/update_build_scripts from
# https://github.com/sharelatex/sharelatex-dev-environment
# Version: 1.1.22

version: "2"

services:
  test_unit:
    build: .
    volumes:
      - .:/app
    working_dir: /app
    environment:
      MOCHA_GREP: ${MOCHA_GREP}
      NODE_ENV: test
    command: npm run test:unit

  test_acceptance:
    build: .
    volumes:
      - .:/app
    working_dir: /app
    extends:
      file: docker-compose-config.yml
      service: dev
    environment:
      ELASTIC_SEARCH_DSN: es:9200
      REDIS_HOST: redis
      MONGO_HOST: mongo
      POSTGRES_HOST: postgres
      MOCHA_GREP: ${MOCHA_GREP}
      LOG_LEVEL: ERROR
      NODE_ENV: test
    depends_on:
      - mongo
      - redis
    command: npm run test:acceptance

  tar:
    build: .
    image: ci/$PROJECT_NAME:$BRANCH_NAME-$BUILD_NUMBER
    volumes:
      - ./:/tmp/build/
    command: tar -czf /tmp/build/build.tar.gz --exclude=build.tar.gz --exclude-vcs .
    user: root

  redis:
    image: redis

  mongo:
    image: mongo:3.4

entrypoint.sh (new file)

@@ -0,0 +1,6 @@
#!/bin/bash
set -o pipefail
/app/inner-entrypoint.sh "$@" 2>&1 | ts

inner-entrypoint.sh (new executable)

@@ -0,0 +1,27 @@
#!/bin/sh
set -x
date
echo "Changing permissions of /var/run/docker.sock for sibling containers"
ls -al /var/run/docker.sock
docker --version
cat /etc/passwd
DOCKER_GROUP=$(stat -c '%g' /var/run/docker.sock)
groupadd --non-unique --gid ${DOCKER_GROUP} dockeronhost
usermod -aG dockeronhost node
mkdir -p /app/cache
chown -R node:node /app/cache
mkdir -p /app/compiles
chown -R node:node /app/compiles
chown -R node:node /app/bin/synctex
mkdir -p /app/test/acceptance/fixtures/tmp/
chown -R node:node /app
chown -R node:node /app/bin
exec runuser -u node -- "$@"

install_deps.sh (new executable)

@@ -0,0 +1,4 @@
#!/bin/sh
wget -qO- https://get.docker.com/ | sh
apt-get install poppler-utils vim ghostscript --yes
npm rebuild

kube.yaml (new file)

@@ -0,0 +1,41 @@
apiVersion: v1
kind: Service
metadata:
  name: clsi
  namespace: default
spec:
  type: LoadBalancer
  ports:
    - port: 80
      protocol: TCP
      targetPort: 80
  selector:
    run: clsi
---
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: clsi
  namespace: default
spec:
  replicas: 2
  template:
    metadata:
      labels:
        run: clsi
    spec:
      containers:
        - name: clsi
          image: gcr.io/henry-terraform-admin/clsi
          imagePullPolicy: Always
          readinessProbe:
            httpGet:
              path: status
              port: 80
            periodSeconds: 5
            initialDelaySeconds: 0
            failureThreshold: 3
            successThreshold: 1

nodemon.json (new file)

@@ -0,0 +1,19 @@
{
  "ignore": [
    ".git",
    "node_modules/"
  ],
  "verbose": true,
  "legacyWatch": true,
  "execMap": {
    "js": "npm run start"
  },
  "watch": [
    "app/coffee/",
    "app.coffee",
    "config/"
  ],
  "ext": "coffee"
}

npm-shrinkwrap.json (generated; diff suppressed because it is too large)

package.json

@@ -7,46 +7,48 @@
"url": "https://github.com/sharelatex/clsi-sharelatex.git" "url": "https://github.com/sharelatex/clsi-sharelatex.git"
}, },
"scripts": { "scripts": {
"compile:app": "coffee -o app/js -c app/coffee && coffee -c app.coffee", "compile:app": "([ -e app/coffee ] && coffee -m $COFFEE_OPTIONS -o app/js -c app/coffee || echo 'No CoffeeScript folder to compile') && ( [ -e app.coffee ] && coffee -m $COFFEE_OPTIONS -c app.coffee || echo 'No CoffeeScript app to compile')",
"start": "npm run compile:app && node app.js" "start": "npm run compile:app && node $NODE_APP_OPTIONS app.js",
"test:acceptance:_run": "mocha --recursive --reporter spec --timeout 30000 --exit $@ test/acceptance/js",
"test:acceptance": "npm run compile:app && npm run compile:acceptance_tests && npm run test:acceptance:_run -- --grep=$MOCHA_GREP",
"test:unit:_run": "mocha --recursive --reporter spec --exit $@ test/unit/js",
"test:unit": "npm run compile:app && npm run compile:unit_tests && npm run test:unit:_run -- --grep=$MOCHA_GREP",
"compile:unit_tests": "[ ! -e test/unit/coffee ] && echo 'No unit tests to compile' || coffee -o test/unit/js -c test/unit/coffee",
"compile:acceptance_tests": "[ ! -e test/acceptance/coffee ] && echo 'No acceptance tests to compile' || coffee -o test/acceptance/js -c test/acceptance/coffee",
"compile:all": "npm run compile:app && npm run compile:unit_tests && npm run compile:acceptance_tests && npm run compile:smoke_tests",
"nodemon": "nodemon --config nodemon.json",
"compile:smoke_tests": "[ ! -e test/smoke/coffee ] && echo 'No smoke tests to compile' || coffee -o test/smoke/js -c test/smoke/coffee"
}, },
"author": "James Allen <james@sharelatex.com>", "author": "James Allen <james@sharelatex.com>",
"dependencies": { "dependencies": {
"async": "0.2.9", "async": "0.2.9",
"body-parser": "^1.2.0", "body-parser": "^1.2.0",
"dockerode": "^2.5.3",
"express": "^4.2.0", "express": "^4.2.0",
"fs-extra": "^0.16.3", "fs-extra": "^0.16.3",
"grunt-mkdir": "^1.0.0",
"heapdump": "^0.3.5", "heapdump": "^0.3.5",
"lockfile": "^1.0.3", "lockfile": "^1.0.3",
"logger-sharelatex": "git+https://github.com/sharelatex/logger-sharelatex.git#v1.5.4", "logger-sharelatex": "^1.7.0",
"lynx": "0.0.11", "lynx": "0.0.11",
"metrics-sharelatex": "git+https://github.com/sharelatex/metrics-sharelatex.git#v1.5.0", "metrics-sharelatex": "^2.2.0",
"mkdirp": "0.3.5", "mkdirp": "0.3.5",
"mysql": "2.6.2", "mysql": "2.6.2",
"request": "^2.21.0", "request": "^2.21.0",
"sequelize": "^2.1.3", "sequelize": "^4.38.0",
"settings-sharelatex": "git+https://github.com/sharelatex/settings-sharelatex.git#v1.0.0", "settings-sharelatex": "git+https://github.com/sharelatex/settings-sharelatex.git#v1.1.0",
"smoke-test-sharelatex": "git+https://github.com/sharelatex/smoke-test-sharelatex.git#v0.2.0", "smoke-test-sharelatex": "git+https://github.com/sharelatex/smoke-test-sharelatex.git#v0.2.0",
"sqlite3": "~3.1.8", "sqlite3": "^4.0.6",
"underscore": "^1.8.2", "underscore": "^1.8.2",
"v8-profiler": "^5.2.4", "v8-profiler-node8": "^6.0.1",
"wrench": "~1.5.4" "wrench": "~1.5.4"
}, },
"devDependencies": { "devDependencies": {
"mocha": "1.10.0",
"coffee-script": "1.6.0",
"chai": "~1.8.1",
"sinon": "~1.7.3",
"grunt": "~0.4.2",
"grunt-contrib-coffee": "~0.7.0",
"grunt-contrib-clean": "~0.5.0",
"grunt-shell": "~0.6.1",
"grunt-mocha-test": "~0.8.1",
"sandboxed-module": "~0.3.0",
"timekeeper": "0.0.4",
"grunt-execute": "^0.1.5",
"bunyan": "^0.22.1", "bunyan": "^0.22.1",
"grunt-bunyan": "^0.5.0" "chai": "~1.8.1",
"coffeescript": "1.6.0",
"mocha": "^4.0.1",
"sandboxed-module": "~0.3.0",
"sinon": "~1.7.3",
"timekeeper": "0.0.4"
} }
} }

patch-texlive-dockerfile (new file, 3 lines)

@@ -0,0 +1,3 @@
FROM quay.io/sharelatex/texlive-full:2017.1
# RUN usermod -u 1001 tex

seccomp/clsi-profile.json (new file, 836 lines)

@@ -0,0 +1,836 @@
{
"defaultAction": "SCMP_ACT_ERRNO",
"architectures": [
"SCMP_ARCH_X86_64",
"SCMP_ARCH_X86",
"SCMP_ARCH_X32"
],
"syscalls": [
{
"name": "access",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "arch_prctl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "brk",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "chdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "chmod",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clock_getres",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clock_gettime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clock_nanosleep",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "clone",
"action": "SCMP_ACT_ALLOW",
"args": [
{
"index": 0,
"value": 2080505856,
"valueTwo": 0,
"op": "SCMP_CMP_MASKED_EQ"
}
]
},
{
"name": "close",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "copy_file_range",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "creat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "dup",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "dup2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "dup3",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "execve",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "execveat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "exit",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "exit_group",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "faccessat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fadvise64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fadvise64_64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fallocate",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchmod",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchmodat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fcntl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fcntl64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fdatasync",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fork",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstatat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstatfs",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fstatfs64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fsync",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "ftruncate",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "ftruncate64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "futex",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "futimesat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getcpu",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getcwd",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getdents",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getdents64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getegid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getegid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "geteuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "geteuid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgroups",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getgroups32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpgrp",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getppid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getpriority",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresgid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getresuid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getrlimit",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "get_robust_list",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getrusage",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getsid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "gettid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "getuid32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "ioctl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "kill",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "_llseek",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "lseek",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "lstat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "lstat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "madvise",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mkdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mkdirat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mmap",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mmap2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mprotect",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "mremap",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "munmap",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "newfstatat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "open",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "openat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pause",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pipe",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pipe2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "prctl",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pread64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "preadv",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "prlimit64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pwrite64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pwritev",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "read",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "readlink",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "readlinkat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "readv",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rename",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "renameat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "renameat2",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "restart_syscall",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rmdir",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigaction",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigpending",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigprocmask",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigqueueinfo",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigreturn",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigsuspend",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_sigtimedwait",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "rt_tgsigqueueinfo",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_getaffinity",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_getparam",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_get_priority_max",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_get_priority_min",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_getscheduler",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_rr_get_interval",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sched_yield",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sendfile",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sendfile64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setgroups",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setgroups32",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "set_robust_list",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "set_tid_address",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sigaltstack",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "stat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "stat64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "statfs",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "statfs64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sync",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sync_file_range",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "syncfs",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "sysinfo",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "tgkill",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_create",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_delete",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_getoverrun",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_gettime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "timer_settime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "times",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "tkill",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "truncate",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "truncate64",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "umask",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "uname",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "unlink",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "unlinkat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "utime",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "utimensat",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "utimes",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "vfork",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "vhangup",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "wait4",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "waitid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "write",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "writev",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "pread",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setgid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "setuid",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "capget",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "capset",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "fchown",
"action": "SCMP_ACT_ALLOW",
"args": []
},
{
"name": "gettimeofday",
"action": "SCMP_ACT_ALLOW",
"args": []
}, {
"name": "epoll_pwait",
"action": "SCMP_ACT_ALLOW",
"args": []
}
]
}
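
This is a whitelist profile: defaultAction SCMP_ACT_ERRNO makes every syscall not listed above fail, which is why even mundane calls such as read, write and stat have to be enumerated. A hedged sketch of how such a profile can be attached to a compile container through dockerode (the image, command and profile path are assumptions for illustration, not quoted from this commit):

fs = require "fs"
Docker = require "dockerode"

docker = new Docker()
profile = fs.readFileSync "seccomp/clsi-profile.json", "utf8"
docker.createContainer
	Image: "quay.io/sharelatex/texlive-full:2017.1"
	Cmd: ["latexmk", "-pdf", "main.tex"]
	HostConfig:
		# The Docker API accepts the serialised profile JSON inline.
		SecurityOpt: ["seccomp=#{profile}"]
, (error, container) ->
	throw error if error?
	container.start (error) ->
		throw error if error?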

synctex.profile (new file, 34 lines)

@@ -0,0 +1,34 @@
include /etc/firejail/disable-common.inc
include /etc/firejail/disable-devel.inc
# include /etc/firejail/disable-mgmt.inc ## removed in 0.9.40
# include /etc/firejail/disable-secret.inc ## removed in 0.9.40
read-only /bin
blacklist /boot
blacklist /dev
read-only /etc
blacklist /home # blacklisted for synctex
read-only /lib
read-only /lib64
blacklist /media
blacklist /mnt
blacklist /opt
blacklist /root
read-only /run
blacklist /sbin
blacklist /selinux
blacklist /src
blacklist /sys
read-only /usr
caps.drop all
noroot
nogroups
net none
private-tmp
private-dev
shell none
seccomp
nonewprivs
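
The profile confines the synctex binary with firejail: the filesystem is mostly read-only or blacklisted, networking is disabled, and a seccomp filter plus nonewprivs applies on top. A hedged sketch of invoking the binary under this profile (the wrapper the service actually uses may differ; --profile is standard firejail usage):

ChildProcess = require "child_process"

args = ["--profile=synctex.profile", "/app/bin/synctex", "code", "output.pdf", "main.tex", "3", "5"]
ChildProcess.execFile "firejail", args, timeout: 10000, (error, stdout, stderr) ->
	throw error if error?
	console.log stdout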


@@ -1,9 +1,10 @@
 Client = require "./helpers/Client"
 request = require "request"
 require("chai").should()
+ClsiApp = require "./helpers/ClsiApp"

 describe "Broken LaTeX file", ->
-	before ->
+	before (done)->
 		@broken_request =
 			resources: [
 				path: "main.tex"
@@ -24,6 +25,7 @@ describe "Broken LaTeX file", ->
 					\\end{document}
 				'''
 			]
+		ClsiApp.ensureRunning done

 	describe "on first run", ->
 		before (done) ->

@@ -1,9 +1,10 @@
 Client = require "./helpers/Client"
 request = require "request"
 require("chai").should()
+ClsiApp = require "./helpers/ClsiApp"

 describe "Deleting Old Files", ->
-	before ->
+	before (done)->
 		@request =
 			resources: [
 				path: "main.tex"
@@ -14,7 +15,8 @@ describe "Deleting Old Files", ->
 					\\end{document}
 				'''
 			]
+		ClsiApp.ensureRunning done

 	describe "on first run", ->
 		before (done) ->
 			@project_id = Client.randomId()


@@ -3,15 +3,25 @@ request = require "request"
require("chai").should() require("chai").should()
fs = require "fs" fs = require "fs"
ChildProcess = require "child_process" ChildProcess = require "child_process"
ClsiApp = require "./helpers/ClsiApp"
fixturePath = (path) -> __dirname + "/../fixtures/" + path logger = require("logger-sharelatex")
Path = require("path")
fixturePath = (path) -> Path.normalize(__dirname + "/../fixtures/" + path)
process = require "process"
console.log process.pid, process.ppid, process.getuid(),process.getgroups(), "PID"
try try
console.log "creating tmp directory", fixturePath("tmp")
fs.mkdirSync(fixturePath("tmp")) fs.mkdirSync(fixturePath("tmp"))
catch e catch err
console.log err, fixturePath("tmp"), "unable to create fixture tmp path"
MOCHA_LATEX_TIMEOUT = 60 * 1000
convertToPng = (pdfPath, pngPath, callback = (error) ->) -> convertToPng = (pdfPath, pngPath, callback = (error) ->) ->
convert = ChildProcess.exec "convert #{fixturePath(pdfPath)} #{fixturePath(pngPath)}" command = "convert #{fixturePath(pdfPath)} #{fixturePath(pngPath)}"
console.log "COMMAND"
console.log command
convert = ChildProcess.exec command
stdout = "" stdout = ""
convert.stdout.on "data", (chunk) -> console.log "STDOUT", chunk.toString() convert.stdout.on "data", (chunk) -> console.log "STDOUT", chunk.toString()
convert.stderr.on "data", (chunk) -> console.log "STDERR", chunk.toString() convert.stderr.on "data", (chunk) -> console.log "STDERR", chunk.toString()
@@ -25,7 +35,10 @@ compare = (originalPath, generatedPath, callback = (error, same) ->) ->
proc.stderr.on "data", (chunk) -> stderr += chunk proc.stderr.on "data", (chunk) -> stderr += chunk
proc.on "exit", () -> proc.on "exit", () ->
if stderr.trim() == "0 (0)" if stderr.trim() == "0 (0)"
fs.unlink diff_file # remove output diff if test matches expected image # remove output diff if test matches expected image
fs.unlink diff_file, (err) ->
if err
throw err
callback null, true callback null, true
else else
console.log "compare result", stderr console.log "compare result", stderr
@@ -40,7 +53,6 @@ checkPdfInfo = (pdfPath, callback = (error, output) ->) ->
if stdout.match(/Optimized:\s+yes/) if stdout.match(/Optimized:\s+yes/)
callback null, true callback null, true
else else
console.log "pdfinfo result", stdout
callback null, false callback null, false
compareMultiplePages = (project_id, callback = (error) ->) -> compareMultiplePages = (project_id, callback = (error) ->) ->
@@ -57,6 +69,8 @@ compareMultiplePages = (project_id, callback = (error) ->) ->
compareNext 0, callback compareNext 0, callback
comparePdf = (project_id, example_dir, callback = (error) ->) -> comparePdf = (project_id, example_dir, callback = (error) ->) ->
console.log "CONVERT"
console.log "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png"
convertToPng "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png", (error) => convertToPng "tmp/#{project_id}.pdf", "tmp/#{project_id}-generated.png", (error) =>
throw error if error? throw error if error?
convertToPng "examples/#{example_dir}/output.pdf", "tmp/#{project_id}-source.png", (error) => convertToPng "examples/#{example_dir}/output.pdf", "tmp/#{project_id}-source.png", (error) =>
@@ -75,6 +89,7 @@ comparePdf = (project_id, example_dir, callback = (error) ->) ->
downloadAndComparePdf = (project_id, example_dir, url, callback = (error) ->) -> downloadAndComparePdf = (project_id, example_dir, url, callback = (error) ->) ->
writeStream = fs.createWriteStream(fixturePath("tmp/#{project_id}.pdf")) writeStream = fs.createWriteStream(fixturePath("tmp/#{project_id}.pdf"))
request.get(url).pipe(writeStream) request.get(url).pipe(writeStream)
console.log("writing file out", fixturePath("tmp/#{project_id}.pdf"))
writeStream.on "close", () => writeStream.on "close", () =>
checkPdfInfo "tmp/#{project_id}.pdf", (error, optimised) => checkPdfInfo "tmp/#{project_id}.pdf", (error, optimised) =>
throw error if error? throw error if error?
@@ -85,7 +100,9 @@ Client.runServer(4242, fixturePath("examples"))
describe "Example Documents", -> describe "Example Documents", ->
before (done) -> before (done) ->
ChildProcess.exec("rm test/acceptance/fixtures/tmp/*").on "exit", () -> done() ChildProcess.exec("rm test/acceptance/fixtures/tmp/*").on "exit", () ->
ClsiApp.ensureRunning done
for example_dir in fs.readdirSync fixturePath("examples") for example_dir in fs.readdirSync fixturePath("examples")
do (example_dir) -> do (example_dir) ->
@@ -94,6 +111,7 @@ describe "Example Documents", ->
@project_id = Client.randomId() + "_" + example_dir @project_id = Client.randomId() + "_" + example_dir
it "should generate the correct pdf", (done) -> it "should generate the correct pdf", (done) ->
this.timeout(MOCHA_LATEX_TIMEOUT)
Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) => Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) =>
if error || body?.compile?.status is "failure" if error || body?.compile?.status is "failure"
console.log "DEBUG: error", error, "body", JSON.stringify(body) console.log "DEBUG: error", error, "body", JSON.stringify(body)
@@ -101,6 +119,7 @@ describe "Example Documents", ->
downloadAndComparePdf(@project_id, example_dir, pdf.url, done) downloadAndComparePdf(@project_id, example_dir, pdf.url, done)
it "should generate the correct pdf on the second run as well", (done) -> it "should generate the correct pdf on the second run as well", (done) ->
this.timeout(MOCHA_LATEX_TIMEOUT)
Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) => Client.compileDirectory @project_id, fixturePath("examples"), example_dir, 4242, (error, res, body) =>
if error || body?.compile?.status is "failure" if error || body?.compile?.status is "failure"
console.log "DEBUG: error", error, "body", JSON.stringify(body) console.log "DEBUG: error", error, "body", JSON.stringify(body)


@@ -1,6 +1,7 @@
 Client = require "./helpers/Client"
 request = require "request"
 require("chai").should()
+ClsiApp = require "./helpers/ClsiApp"

 describe "Simple LaTeX file", ->
 	before (done) ->
@@ -15,7 +16,8 @@ describe "Simple LaTeX file", ->
 				\\end{document}
 			'''
 		]
-		Client.compile @project_id, @request, (@error, @res, @body) => done()
+		ClsiApp.ensureRunning =>
+			Client.compile @project_id, @request, (@error, @res, @body) => done()

 	it "should return the PDF", ->
 		pdf = Client.getOutputFile(@body, "pdf")


@@ -2,21 +2,25 @@ Client = require "./helpers/Client"
 request = require "request"
 require("chai").should()
 expect = require("chai").expect
+ClsiApp = require "./helpers/ClsiApp"
+crypto = require("crypto")

 describe "Syncing", ->
 	before (done) ->
-		@request =
-			resources: [
-				path: "main.tex"
-				content: '''
+		content = '''
 			\\documentclass{article}
 			\\begin{document}
 			Hello world
 			\\end{document}
 		'''
+		@request =
+			resources: [
+				path: "main.tex"
+				content: content
 			]
 		@project_id = Client.randomId()
-		Client.compile @project_id, @request, (@error, @res, @body) => done()
+		ClsiApp.ensureRunning =>
+			Client.compile @project_id, @request, (@error, @res, @body) => done()

 	describe "from code to pdf", ->
 		it "should return the correct location", (done) ->
@@ -29,7 +33,7 @@ describe "Syncing", ->
 	describe "from pdf to code", ->
 		it "should return the correct location", (done) ->
-			Client.syncFromPdf @project_id, 1, 100, 200, (error, codePositions) ->
+			Client.syncFromPdf @project_id, 1, 100, 200, (error, codePositions) =>
 				throw error if error?
 				expect(codePositions).to.deep.equal(
 					code: [ { file: 'main.tex', line: 3, column: -1 } ]


@@ -1,24 +1,27 @@
 Client = require "./helpers/Client"
 request = require "request"
 require("chai").should()
+ClsiApp = require "./helpers/ClsiApp"

 describe "Timed out compile", ->
 	before (done) ->
 		@request =
 			options:
-				timeout: 1 #seconds
+				timeout: 10 #seconds
 			resources: [
 				path: "main.tex"
 				content: '''
 					\\documentclass{article}
 					\\begin{document}
-					Hello world
-					\\input{|"sleep 10"}
+					\\def\\x{Hello!\\par\\x}
+					\\x
 					\\end{document}
 				'''
 			]
 		@project_id = Client.randomId()
-		Client.compile @project_id, @request, (@error, @res, @body) => done()
+		ClsiApp.ensureRunning =>
+			Client.compile @project_id, @request, (@error, @res, @body) => done()

 	it "should return a timeout error", ->
 		@body.compile.error.should.equal "container timed out"


@@ -2,6 +2,7 @@ Client = require "./helpers/Client"
 request = require "request"
 require("chai").should()
 sinon = require "sinon"
+ClsiApp = require "./helpers/ClsiApp"

 host = "localhost"
@@ -46,7 +47,8 @@ describe "Url Caching", ->
 			}]
 			sinon.spy Server, "getFile"
-			Client.compile @project_id, @request, (@error, @res, @body) => done()
+			ClsiApp.ensureRunning =>
+				Client.compile @project_id, @request, (@error, @res, @body) => done()

 		afterEach ->
 			Server.getFile.restore()


@@ -4,6 +4,7 @@ require("chai").should()
 expect = require("chai").expect
 path = require("path")
 fs = require("fs")
+ClsiApp = require "./helpers/ClsiApp"

 describe "Syncing", ->
 	before (done) ->
@@ -13,7 +14,8 @@ describe "Syncing", ->
 			content: fs.readFileSync(path.join(__dirname,"../fixtures/naugty_strings.txt"),"utf-8")
 		]
 		@project_id = Client.randomId()
-		Client.compile @project_id, @request, (@error, @res, @body) => done()
+		ClsiApp.ensureRunning =>
+			Client.compile @project_id, @request, (@error, @res, @body) => done()

 	describe "wordcount file", ->
 		it "should return wordcount info", (done) ->


@@ -30,6 +30,7 @@ module.exports = Client =
 		express = require("express")
 		app = express()
 		app.use express.static(directory)
+		console.log("starting test server on", port, host)
 		app.listen(port, host).on "error", (error) ->
 			console.error "error starting server:", error.message
 			process.exit(1)


@@ -0,0 +1,24 @@
app = require('../../../../app')
require("logger-sharelatex").logger.level("info")
logger = require("logger-sharelatex")
Settings = require("settings-sharelatex")

module.exports =
	running: false
	initing: false
	callbacks: []

	ensureRunning: (callback = (error) ->) ->
		if @running
			return callback()
		else if @initing
			@callbacks.push callback
		else
			@initing = true
			@callbacks.push callback
			app.listen Settings.internal?.clsi?.port, "localhost", (error) =>
				throw error if error?
				@running = true
				logger.log("clsi running in dev mode")
				for callback in @callbacks
					callback()
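
ensureRunning serialises startup across the acceptance test files: the first caller boots the app, concurrent callers are queued on @callbacks, and once @running is set every later caller returns immediately. Typical use from a test, mirroring the changes in this commit:

ClsiApp = require "./helpers/ClsiApp"
Client = require "./helpers/Client"

describe "Some compile behaviour", ->
	before (done) ->
		@project_id = Client.randomId()
		@request = resources: []
		# Safe to call from every file; the app starts at most once per process.
		ClsiApp.ensureRunning =>
			Client.compile @project_id, @request, (@error, @res, @body) => done()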


@@ -14,7 +14,7 @@ describe "CompileController", ->
 			clsi:
 				url: "http://clsi.example.com"
 			"./ProjectPersistenceManager": @ProjectPersistenceManager = {}
-			"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
+			"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub(), err:sinon.stub(), warn: sinon.stub()}
 		@Settings.externalUrl = "http://www.example.com"
 		@req = {}
 		@res = {}
@@ -144,7 +144,7 @@ describe "CompileController", ->
 				file: @file
 				line: @line.toString()
 				column: @column.toString()
-			@res.send = sinon.stub()
+			@res.json = sinon.stub()
 			@CompileManager.syncFromCode = sinon.stub().callsArgWith(5, null, @pdfPositions = ["mock-positions"])
 			@CompileController.syncFromCode @req, @res, @next
@@ -155,8 +155,8 @@ describe "CompileController", ->
 				.should.equal true

 		it "should return the positions", ->
-			@res.send
-				.calledWith(JSON.stringify
+			@res.json
+				.calledWith(
 					pdf: @pdfPositions
 				)
 				.should.equal true
@@ -173,7 +173,7 @@ describe "CompileController", ->
 				page: @page.toString()
 				h: @h.toString()
 				v: @v.toString()
-			@res.send = sinon.stub()
+			@res.json = sinon.stub()
 			@CompileManager.syncFromPdf = sinon.stub().callsArgWith(5, null, @codePositions = ["mock-positions"])
 			@CompileController.syncFromPdf @req, @res, @next
@@ -184,8 +184,8 @@ describe "CompileController", ->
 				.should.equal true

 		it "should return the positions", ->
-			@res.send
-				.calledWith(JSON.stringify
+			@res.json
+				.calledWith(
 					code: @codePositions
 				)
 				.should.equal true
@@ -199,7 +199,7 @@ describe "CompileController", ->
 			@req.query =
 				file: @file
 				image: @image = "example.com/image"
-			@res.send = sinon.stub()
+			@res.json = sinon.stub()
 			@CompileManager.wordcount = sinon.stub().callsArgWith(4, null, @texcount = ["mock-texcount"])
 			@CompileController.wordcount @req, @res, @next
@@ -210,8 +210,8 @@ describe "CompileController", ->
 				.should.equal true

 		it "should return the texcount info", ->
-			@res.send
-				.calledWith(JSON.stringify
+			@res.json
+				.calledWith(
 					texcount: @texcount
 				)
 				.should.equal true
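
These assertions pin down a controller change from res.send(JSON.stringify(...)) to res.json(...), which serialises and sets the Content-Type header in one step. A hedged sketch of the handler shape the tests now expect (the parameter plumbing is assumed, not quoted from the commit):

CompileController =
	syncFromCode: (req, res, next) ->
		{file, line, column} = req.query
		CompileManager.syncFromCode req.params.project_id, req.params.user_id, file, line, column, (error, pdfPositions) ->
			return next(error) if error?
			# Previously: res.send JSON.stringify(pdf: pdfPositions)
			res.json pdf: pdfPositions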


@@ -13,7 +13,14 @@ describe "CompileManager", ->
"./ResourceWriter": @ResourceWriter = {} "./ResourceWriter": @ResourceWriter = {}
"./OutputFileFinder": @OutputFileFinder = {} "./OutputFileFinder": @OutputFileFinder = {}
"./OutputCacheManager": @OutputCacheManager = {} "./OutputCacheManager": @OutputCacheManager = {}
"settings-sharelatex": @Settings = { path: compilesDir: "/compiles/dir" } "settings-sharelatex": @Settings =
path:
compilesDir: "/compiles/dir"
synctexBaseDir: -> "/compile"
clsi:
docker:
image: "SOMEIMAGE"
"logger-sharelatex": @logger = { log: sinon.stub() , info:->} "logger-sharelatex": @logger = { log: sinon.stub() , info:->}
"child_process": @child_process = {} "child_process": @child_process = {}
"./CommandRunner": @CommandRunner = {} "./CommandRunner": @CommandRunner = {}
@@ -23,13 +30,14 @@ describe "CompileManager", ->
"fs": @fs = {} "fs": @fs = {}
"fs-extra": @fse = { ensureDir: sinon.stub().callsArg(1) } "fs-extra": @fse = { ensureDir: sinon.stub().callsArg(1) }
@callback = sinon.stub() @callback = sinon.stub()
@project_id = "project-id-123"
@user_id = "1234"
describe "doCompileWithLock", -> describe "doCompileWithLock", ->
beforeEach -> beforeEach ->
@request = @request =
resources: @resources = "mock-resources" resources: @resources = "mock-resources"
project_id: @project_id = "project-id-123" project_id: @project_id
user_id: @user_id = "1234" user_id: @user_id
@output_files = ["foo", "bar"] @output_files = ["foo", "bar"]
@Settings.compileDir = "compiles" @Settings.compileDir = "compiles"
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}" @compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@@ -95,11 +103,12 @@ describe "CompileManager", ->
@request = @request =
resources: @resources = "mock-resources" resources: @resources = "mock-resources"
rootResourcePath: @rootResourcePath = "main.tex" rootResourcePath: @rootResourcePath = "main.tex"
project_id: @project_id = "project-id-123" project_id: @project_id
user_id: @user_id = "1234" user_id: @user_id
compiler: @compiler = "pdflatex" compiler: @compiler = "pdflatex"
timeout: @timeout = 42000 timeout: @timeout = 42000
imageName: @image = "example.com/image" imageName: @image = "example.com/image"
flags: @flags = ["-file-line-error"]
@env = {} @env = {}
@Settings.compileDir = "compiles" @Settings.compileDir = "compiles"
@compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}" @compileDir = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@@ -109,7 +118,7 @@ describe "CompileManager", ->
@OutputCacheManager.saveOutputFiles = sinon.stub().callsArgWith(2, null, @build_files) @OutputCacheManager.saveOutputFiles = sinon.stub().callsArgWith(2, null, @build_files)
@DraftModeManager.injectDraftMode = sinon.stub().callsArg(1) @DraftModeManager.injectDraftMode = sinon.stub().callsArg(1)
@TikzManager.checkMainFile = sinon.stub().callsArg(3, false) @TikzManager.checkMainFile = sinon.stub().callsArg(3, false)
describe "normally", -> describe "normally", ->
beforeEach -> beforeEach ->
@CompileManager.doCompile @request, @callback @CompileManager.doCompile @request, @callback
@@ -127,6 +136,7 @@ describe "CompileManager", ->
compiler: @compiler compiler: @compiler
timeout: @timeout timeout: @timeout
image: @image image: @image
flags: @flags
environment: @env environment: @env
}) })
.should.equal true .should.equal true
@@ -138,15 +148,15 @@ describe "CompileManager", ->
it "should return the output files", -> it "should return the output files", ->
@callback.calledWith(null, @build_files).should.equal true @callback.calledWith(null, @build_files).should.equal true
it "should not inject draft mode by default", -> it "should not inject draft mode by default", ->
@DraftModeManager.injectDraftMode.called.should.equal false @DraftModeManager.injectDraftMode.called.should.equal false
describe "with draft mode", -> describe "with draft mode", ->
beforeEach -> beforeEach ->
@request.draft = true @request.draft = true
@CompileManager.doCompile @request, @callback @CompileManager.doCompile @request, @callback
it "should inject the draft mode header", -> it "should inject the draft mode header", ->
@DraftModeManager.injectDraftMode @DraftModeManager.injectDraftMode
.calledWith(@compileDir + "/" + @rootResourcePath) .calledWith(@compileDir + "/" + @rootResourcePath)
@@ -165,6 +175,7 @@ describe "CompileManager", ->
compiler: @compiler compiler: @compiler
timeout: @timeout timeout: @timeout
image: @image image: @image
flags: @flags
environment: {'CHKTEX_OPTIONS': '-nall -e9 -e10 -w15 -w16', 'CHKTEX_EXIT_ON_ERROR':1, 'CHKTEX_ULIMIT_OPTIONS': '-t 5 -v 64000'} environment: {'CHKTEX_OPTIONS': '-nall -e9 -e10 -w15 -w16', 'CHKTEX_EXIT_ON_ERROR':1, 'CHKTEX_ULIMIT_OPTIONS': '-t 5 -v 64000'}
}) })
.should.equal true .should.equal true
@@ -183,6 +194,7 @@ describe "CompileManager", ->
compiler: @compiler compiler: @compiler
timeout: @timeout timeout: @timeout
image: @image image: @image
flags: @flags
environment: @env environment: @env
}) })
.should.equal true .should.equal true
@@ -247,16 +259,23 @@ describe "CompileManager", ->
describe "syncFromCode", -> describe "syncFromCode", ->
beforeEach -> beforeEach ->
@fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true}) @fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true})
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@page}\t#{@h}\t#{@v}\t#{@width}\t#{@height}\n", "") @stdout = "NODE\t#{@page}\t#{@h}\t#{@v}\t#{@width}\t#{@height}\n"
@CommandRunner.run = sinon.stub().callsArgWith(6, null, {stdout:@stdout})
@CompileManager.syncFromCode @project_id, @user_id, @file_name, @line, @column, @callback @CompileManager.syncFromCode @project_id, @user_id, @file_name, @line, @column, @callback
it "should execute the synctex binary", -> it "should execute the synctex binary", ->
bin_path = Path.resolve(__dirname + "/../../../bin/synctex") bin_path = Path.resolve(__dirname + "/../../../bin/synctex")
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf" synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf"
file_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}" file_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}"
@child_process.execFile @CommandRunner.run
.calledWith(bin_path, ["code", synctex_path, file_path, @line, @column], timeout: 10000) .calledWith(
.should.equal true "#{@project_id}-#{@user_id}",
['/opt/synctex', 'code', synctex_path, file_path, @line, @column],
"#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}",
@Settings.clsi.docker.image,
60000,
{}
).should.equal true
it "should call the callback with the parsed output", -> it "should call the callback with the parsed output", ->
@callback @callback
@@ -272,15 +291,21 @@ describe "CompileManager", ->
describe "syncFromPdf", -> describe "syncFromPdf", ->
beforeEach -> beforeEach ->
@fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true}) @fs.stat = sinon.stub().callsArgWith(1, null,{isFile: ()->true})
@child_process.execFile.callsArgWith(3, null, @stdout = "NODE\t#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}\t#{@line}\t#{@column}\n", "") @stdout = "NODE\t#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/#{@file_name}\t#{@line}\t#{@column}\n"
@CommandRunner.run = sinon.stub().callsArgWith(6, null, {stdout:@stdout})
@CompileManager.syncFromPdf @project_id, @user_id, @page, @h, @v, @callback @CompileManager.syncFromPdf @project_id, @user_id, @page, @h, @v, @callback
it "should execute the synctex binary", -> it "should execute the synctex binary", ->
bin_path = Path.resolve(__dirname + "/../../../bin/synctex") bin_path = Path.resolve(__dirname + "/../../../bin/synctex")
synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf" synctex_path = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}/output.pdf"
@child_process.execFile @CommandRunner.run
.calledWith(bin_path, ["pdf", synctex_path, @page, @h, @v], timeout: 10000) .calledWith(
.should.equal true "#{@project_id}-#{@user_id}",
['/opt/synctex', "pdf", synctex_path, @page, @h, @v],
"#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}",
@Settings.clsi.docker.image,
60000,
{}).should.equal true
it "should call the callback with the parsed output", -> it "should call the callback with the parsed output", ->
@callback @callback
@@ -297,8 +322,8 @@ describe "CompileManager", ->
@fs.readFile = sinon.stub().callsArgWith(2, null, @stdout = "Encoding: ascii\nWords in text: 2") @fs.readFile = sinon.stub().callsArgWith(2, null, @stdout = "Encoding: ascii\nWords in text: 2")
@callback = sinon.stub() @callback = sinon.stub()
@project_id = "project-id-123" @project_id
@timeout = 10 * 1000 @timeout = 60 * 1000
@file_name = "main.tex" @file_name = "main.tex"
@Settings.path.compilesDir = "/local/compile/directory" @Settings.path.compilesDir = "/local/compile/directory"
@image = "example.com/image" @image = "example.com/image"
@@ -309,7 +334,7 @@ describe "CompileManager", ->
@directory = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}" @directory = "#{@Settings.path.compilesDir}/#{@project_id}-#{@user_id}"
@file_path = "$COMPILE_DIR/#{@file_name}" @file_path = "$COMPILE_DIR/#{@file_name}"
@command =[ "texcount", "-nocol", "-inc", @file_path, "-out=" + @file_path + ".wc"] @command =[ "texcount", "-nocol", "-inc", @file_path, "-out=" + @file_path + ".wc"]
@CommandRunner.run @CommandRunner.run
.calledWith("#{@project_id}-#{@user_id}", @command, @directory, @image, @timeout, {}) .calledWith("#{@project_id}-#{@user_id}", @command, @directory, @image, @timeout, {})
.should.equal true .should.equal true
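
The reworked expectations show synctex no longer being spawned on the host via child_process.execFile, but run through CommandRunner, inside the configured docker image, as /opt/synctex with a 60-second limit. A minimal sketch of the call shape the assertions imply (names follow the test; the surrounding module wiring is assumed):

Settings = require "settings-sharelatex"
CommandRunner = require "./CommandRunner"

SYNCTEX_BIN = "/opt/synctex"
SYNCTEX_TIMEOUT = 60 * 1000

syncFromCode = (project_id, user_id, file_name, line, column, callback) ->
	base_dir = "#{Settings.path.compilesDir}/#{project_id}-#{user_id}"
	synctex_path = "#{base_dir}/output.pdf"
	file_path = "#{base_dir}/#{file_name}"
	command = [SYNCTEX_BIN, "code", synctex_path, file_path, line, column]
	# Same runner interface the compile itself uses: (name, command, dir, image, timeout, env, cb)
	CommandRunner.run "#{project_id}-#{user_id}", command, base_dir, Settings.clsi.docker.image, SYNCTEX_TIMEOUT, {}, callback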


@@ -0,0 +1,145 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
require "coffee-script"
modulePath = require('path').join __dirname, '../../../app/coffee/DockerLockManager'
describe "LockManager", ->
beforeEach ->
@LockManager = SandboxedModule.require modulePath, requires:
"settings-sharelatex": @Settings =
clsi: docker: {}
"logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
describe "runWithLock", ->
describe "with a single lock", ->
beforeEach (done) ->
@callback = sinon.stub()
@LockManager.runWithLock "lock-one", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world")
, 100
, (err, args...) =>
@callback(err,args...)
done()
it "should call the callback", ->
@callback.calledWith(null,"hello","world").should.equal true
describe "with two locks", ->
beforeEach (done) ->
@callback1 = sinon.stub()
@callback2 = sinon.stub()
@LockManager.runWithLock "lock-one", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 100
, (err, args...) =>
@callback1(err,args...)
@LockManager.runWithLock "lock-two", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 200
, (err, args...) =>
@callback2(err,args...)
done()
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback", ->
@callback2.calledWith(null,"hello","world","two").should.equal true
describe "with lock contention", ->
describe "where the first lock is released quickly", ->
beforeEach (done) ->
@LockManager.MAX_LOCK_WAIT_TIME = 1000
@LockManager.LOCK_TEST_INTERVAL = 100
@callback1 = sinon.stub()
@callback2 = sinon.stub()
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 100
, (err, args...) =>
@callback1(err,args...)
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 200
, (err, args...) =>
@callback2(err,args...)
done()
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback", ->
@callback2.calledWith(null,"hello","world","two").should.equal true
describe "where the first lock is held longer than the waiting time", ->
beforeEach (done) ->
@LockManager.MAX_LOCK_HOLD_TIME = 10000
@LockManager.MAX_LOCK_WAIT_TIME = 1000
@LockManager.LOCK_TEST_INTERVAL = 100
@callback1 = sinon.stub()
@callback2 = sinon.stub()
doneOne = doneTwo = false
finish = (key) ->
doneOne = true if key is 1
doneTwo = true if key is 2
done() if doneOne and doneTwo
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 1100
, (err, args...) =>
@callback1(err,args...)
finish(1)
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 100
, (err, args...) =>
@callback2(err,args...)
finish(2)
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback with an error", ->
error = sinon.match.instanceOf Error
@callback2.calledWith(error).should.equal true
describe "where the first lock is held longer than the max holding time", ->
beforeEach (done) ->
@LockManager.MAX_LOCK_HOLD_TIME = 1000
@LockManager.MAX_LOCK_WAIT_TIME = 2000
@LockManager.LOCK_TEST_INTERVAL = 100
@callback1 = sinon.stub()
@callback2 = sinon.stub()
doneOne = doneTwo = false
finish = (key) ->
doneOne = true if key is 1
doneTwo = true if key is 2
done() if doneOne and doneTwo
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","one")
, 1500
, (err, args...) =>
@callback1(err,args...)
finish(1)
@LockManager.runWithLock "lock", (releaseLock) ->
setTimeout () ->
releaseLock(null, "hello", "world","two")
, 100
, (err, args...) =>
@callback2(err,args...)
finish(2)
it "should call the first callback", ->
@callback1.calledWith(null,"hello","world","one").should.equal true
it "should call the second callback", ->
@callback2.calledWith(null,"hello","world","two").should.equal true
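
Taken together these cases specify a per-key in-process mutex: waiters poll every LOCK_TEST_INTERVAL ms, give up with an Error after MAX_LOCK_WAIT_TIME, but may proceed anyway if the current holder has been holding the lock for longer than MAX_LOCK_HOLD_TIME. A minimal sketch consistent with these tests, not the real DockerLockManager:

LockManager =
	MAX_LOCK_WAIT_TIME: 10000
	MAX_LOCK_HOLD_TIME: 30000
	LOCK_TEST_INTERVAL: 1000
	_heldSince: {}

	runWithLock: (key, runner, callback = (error) ->) ->
		startTime = Date.now()
		attempt = () =>
			heldSince = @_heldSince[key]
			if not heldSince? or (Date.now() - heldSince) > @MAX_LOCK_HOLD_TIME
				# Free, or the holder exceeded the hold limit: take the lock.
				@_heldSince[key] = Date.now()
				runner (args...) =>
					delete @_heldSince[key]
					callback(args...)
			else if (Date.now() - startTime) > @MAX_LOCK_WAIT_TIME
				callback new Error("timed out waiting for lock #{key}")
			else
				setTimeout attempt, @LOCK_TEST_INTERVAL
		attempt()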


@@ -0,0 +1,509 @@
SandboxedModule = require('sandboxed-module')
sinon = require('sinon')
require('chai').should()
expect = require('chai').expect
require "coffee-script"
modulePath = require('path').join __dirname, '../../../app/coffee/DockerRunner'
Path = require "path"
describe "DockerRunner", ->
beforeEach ->
@container = container = {}
@DockerRunner = SandboxedModule.require modulePath, requires:
"settings-sharelatex": @Settings =
clsi: docker: {}
path: {}
"logger-sharelatex": @logger = {
log: sinon.stub(),
error: sinon.stub(),
info: sinon.stub(),
warn: sinon.stub()
}
"dockerode": class Docker
getContainer: sinon.stub().returns(container)
createContainer: sinon.stub().yields(null, container)
listContainers: sinon.stub()
"fs": @fs = { stat: sinon.stub().yields(null,{isDirectory:()->true}) }
"./Metrics":
Timer: class Timer
done: () ->
"./LockManager":
runWithLock: (key, runner, callback) -> runner(callback)
@Docker = Docker
@getContainer = Docker::getContainer
@createContainer = Docker::createContainer
@listContainers = Docker::listContainers
@directory = "/local/compile/directory"
@mainFile = "main-file.tex"
@compiler = "pdflatex"
@image = "example.com/sharelatex/image:2016.2"
@env = {}
@callback = sinon.stub()
@project_id = "project-id-123"
@volumes =
"/local/compile/directory": "/compile"
@Settings.clsi.docker.image = @defaultImage = "default-image"
@Settings.clsi.docker.env = PATH: "mock-path"
describe "run", ->
beforeEach (done)->
@DockerRunner._getContainerOptions = sinon.stub().returns(@options = {mockoptions: "foo"})
@DockerRunner._fingerprintContainer = sinon.stub().returns(@fingerprint = "fingerprint")
@name = "project-#{@project_id}-#{@fingerprint}"
@command = ["mock", "command", "--outdir=$COMPILE_DIR"]
@command_with_dir = ["mock", "command", "--outdir=/compile"]
@timeout = 42000
done()
describe "successfully", ->
beforeEach (done)->
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, (err, output)=>
@callback(err, output)
done()
it "should generate the options for the container", ->
@DockerRunner._getContainerOptions
.calledWith(@command_with_dir, @image, @volumes, @timeout)
.should.equal true
it "should generate the fingerprint from the returned options", ->
@DockerRunner._fingerprintContainer
.calledWith(@options)
.should.equal true
it "should do the run", ->
@DockerRunner._runAndWaitForContainer
.calledWith(@options, @volumes, @timeout)
.should.equal true
it "should call the callback", ->
@callback.calledWith(null, @output).should.equal true
describe 'when path.sandboxedCompilesHostDir is set', ->
beforeEach ->
@Settings.path.sandboxedCompilesHostDir = '/some/host/dir/compiles'
@directory = '/var/lib/sharelatex/data/compiles/xyz'
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, @callback
it 'should re-write the bind directory', ->
volumes = @DockerRunner._runAndWaitForContainer.lastCall.args[1]
expect(volumes).to.deep.equal {
'/some/host/dir/compiles/xyz': '/compile'
}
it "should call the callback", ->
@callback.calledWith(null, @output).should.equal true
describe "when the run throws an error", ->
beforeEach ->
firstTime = true
@output = "mock-output"
@DockerRunner._runAndWaitForContainer = (options, volumes, timeout, callback = (error, output)->) =>
if firstTime
firstTime = false
callback new Error("HTTP code is 500 which indicates error: server error")
else
callback(null, @output)
sinon.spy @DockerRunner, "_runAndWaitForContainer"
@DockerRunner.destroyContainer = sinon.stub().callsArg(3)
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, @callback
it "should do the run twice", ->
@DockerRunner._runAndWaitForContainer
.calledTwice.should.equal true
it "should destroy the container in between", ->
@DockerRunner.destroyContainer
.calledWith(@name, null)
.should.equal true
it "should call the callback", ->
@callback.calledWith(null, @output).should.equal true
describe "with no image", ->
beforeEach ->
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, null, @timeout, @env, @callback
it "should use the default image", ->
@DockerRunner._getContainerOptions
.calledWith(@command_with_dir, @defaultImage, @volumes, @timeout)
.should.equal true
describe "with image override", ->
beforeEach ->
@Settings.texliveImageNameOveride = "overrideimage.com/something"
@DockerRunner._runAndWaitForContainer = sinon.stub().callsArgWith(3, null, @output = "mock-output")
@DockerRunner.run @project_id, @command, @directory, @image, @timeout, @env, @callback
it "should use the override and keep the tag", ->
image = @DockerRunner._getContainerOptions.args[0][1]
image.should.equal "overrideimage.com/something/image:2016.2"
describe "_runAndWaitForContainer", ->
beforeEach ->
@options = {mockoptions: "foo", name: @name = "mock-name"}
@DockerRunner.startContainer = (options, volumes, attachStreamHandler, callback) =>
attachStreamHandler(null, @output = "mock-output")
callback(null, @containerId = "container-id")
sinon.spy @DockerRunner, "startContainer"
@DockerRunner.waitForContainer = sinon.stub().callsArgWith(2, null, @exitCode = 42)
@DockerRunner._runAndWaitForContainer @options, @volumes, @timeout, @callback
it "should create/start the container", ->
@DockerRunner.startContainer
.calledWith(@options, @volumes)
.should.equal true
it "should wait for the container to finish", ->
@DockerRunner.waitForContainer
.calledWith(@name, @timeout)
.should.equal true
it "should call the callback with the output", ->
@callback.calledWith(null, @output).should.equal true
describe "startContainer", ->
beforeEach ->
@attachStreamHandler = sinon.stub()
@attachStreamHandler.cock = true
@options = {mockoptions: "foo", name: "mock-name"}
@container.inspect = sinon.stub().callsArgWith(0)
@DockerRunner.attachToContainer = (containerId, attachStreamHandler, cb)=>
attachStreamHandler()
cb()
sinon.spy @DockerRunner, "attachToContainer"
describe "when the container exists", ->
beforeEach ->
@container.inspect = sinon.stub().callsArgWith(0)
@container.start = sinon.stub().yields()
@DockerRunner.startContainer @options, @volumes, @callback, ->
it "should start the container with the given name", ->
@getContainer
.calledWith(@options.name)
.should.equal true
@container.start
.called
.should.equal true
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should attach to the container", ->
@DockerRunner.attachToContainer.called.should.equal true
it "should call the callback", ->
@callback.called.should.equal true
it "should attach before the container starts", ->
sinon.assert.callOrder(@DockerRunner.attachToContainer, @container.start)
describe "when the container does not exist", ->
beforeEach ()->
exists = false
@container.start = sinon.stub().yields()
@container.inspect = sinon.stub().callsArgWith(0, {statusCode:404})
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should create the container", ->
@createContainer
.calledWith(@options)
.should.equal true
it "should call the callback and stream handler", ->
@attachStreamHandler.called.should.equal true
@callback.called.should.equal true
it "should attach to the container", ->
@DockerRunner.attachToContainer.called.should.equal true
it "should attach before the container starts", ->
sinon.assert.callOrder(@DockerRunner.attachToContainer, @container.start)
describe "when the container is already running", ->
beforeEach ->
error = new Error("HTTP code is 304 which indicates error: server error - start: Cannot start container #{@name}: The container MOCKID is already running.")
error.statusCode = 304
@container.start = sinon.stub().yields(error)
@container.inspect = sinon.stub().callsArgWith(0)
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback and stream handler without an error", ->
@attachStreamHandler.called.should.equal true
@callback.called.should.equal true
describe "when a volume does not exist", ->
beforeEach ()->
@fs.stat = sinon.stub().yields(new Error("no such path"))
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback with an error", ->
@callback.calledWith(new Error()).should.equal true
describe "when a volume exists but is not a directory", ->
beforeEach ->
@fs.stat = sinon.stub().yields(null, {isDirectory: () -> return false})
@DockerRunner.startContainer @options, @volumes, @attachStreamHandler, @callback
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback with an error", ->
@callback.calledWith(new Error()).should.equal true
describe "when a volume does not exist, but sibling-containers are used", ->
beforeEach ->
@fs.stat = sinon.stub().yields(new Error("no such path"))
@Settings.path.sandboxedCompilesHostDir = '/some/path'
@container.start = sinon.stub().yields()
@DockerRunner.startContainer @options, @volumes, @callback
afterEach ->
delete @Settings.path.sandboxedCompilesHostDir
it "should start the container with the given name", ->
@getContainer
.calledWith(@options.name)
.should.equal true
@container.start
.called
.should.equal true
it "should not try to create the container", ->
@createContainer.called.should.equal false
it "should call the callback", ->
@callback.called.should.equal true
@callback.calledWith(new Error()).should.equal false
describe "when the container tries to be created, but already has been (race condition)", ->
describe "waitForContainer", ->
beforeEach ->
@containerId = "container-id"
@timeout = 5000
@container.wait = sinon.stub().yields(null, StatusCode: @statusCode = 42)
@container.kill = sinon.stub().yields()
describe "when the container returns in time", ->
beforeEach ->
@DockerRunner.waitForContainer @containerId, @timeout, @callback
it "should wait for the container", ->
@getContainer
.calledWith(@containerId)
.should.equal true
@container.wait
.called
.should.equal true
it "should call the callback with the exit", ->
@callback
.calledWith(null, @statusCode)
.should.equal true
describe "when the container does not return before the timeout", ->
beforeEach (done) ->
@container.wait = (callback = (error, exitCode) ->) ->
setTimeout () ->
callback(null, StatusCode: 42)
, 100
@timeout = 5
@DockerRunner.waitForContainer @containerId, @timeout, (args...) =>
@callback(args...)
done()
it "should call kill on the container", ->
@getContainer
.calledWith(@containerId)
.should.equal true
@container.kill
.called
.should.equal true
it "should call the callback with an error", ->
error = new Error("container timed out")
error.timedout = true
@callback
.calledWith(error)
.should.equal true
describe "destroyOldContainers", ->
beforeEach (done) ->
oneHourInSeconds = 60 * 60
oneHourInMilliseconds = oneHourInSeconds * 1000
nowInSeconds = Date.now()/1000
@containers = [{
Name: "/project-old-container-name"
Id: "old-container-id"
Created: nowInSeconds - oneHourInSeconds - 100
}, {
Name: "/project-new-container-name"
Id: "new-container-id"
Created: nowInSeconds - oneHourInSeconds + 100
}, {
Name: "/totally-not-a-project-container"
Id: "some-random-id"
Created: nowInSeconds - (2 * oneHourInSeconds )
}]
@DockerRunner.MAX_CONTAINER_AGE = oneHourInMilliseconds
@listContainers.callsArgWith(1, null, @containers)
@DockerRunner.destroyContainer = sinon.stub().callsArg(3)
@DockerRunner.destroyOldContainers (error) =>
@callback(error)
done()
it "should list all containers", ->
@listContainers
.calledWith(all: true)
.should.equal true
it "should destroy old containers", ->
@DockerRunner.destroyContainer
.callCount
.should.equal 1
@DockerRunner.destroyContainer
.calledWith("/project-old-container-name", "old-container-id")
.should.equal true
it "should not destroy new containers", ->
@DockerRunner.destroyContainer
.calledWith("/project-new-container-name", "new-container-id")
.should.equal false
it "should not destroy non-project containers", ->
@DockerRunner.destroyContainer
.calledWith("/totally-not-a-project-container", "some-random-id")
.should.equal false
it "should callback the callback", ->
@callback.called.should.equal true
  describe '_destroyContainer', ->
    beforeEach ->
      @containerId = 'some_id'
      @fakeContainer =
        remove: sinon.stub().callsArgWith(1, null)
      @Docker::getContainer = sinon.stub().returns(@fakeContainer)

    it 'should get the container', (done) ->
      @DockerRunner._destroyContainer @containerId, false, (err) =>
        @Docker::getContainer.callCount.should.equal 1
        @Docker::getContainer.calledWith(@containerId).should.equal true
        done()

    it 'should try to force-destroy the container when shouldForce=true', (done) ->
      @DockerRunner._destroyContainer @containerId, true, (err) =>
        @fakeContainer.remove.callCount.should.equal 1
        @fakeContainer.remove.calledWith({force: true}).should.equal true
        done()

    it 'should not try to force-destroy the container when shouldForce=false', (done) ->
      @DockerRunner._destroyContainer @containerId, false, (err) =>
        @fakeContainer.remove.callCount.should.equal 1
        @fakeContainer.remove.calledWith({force: false}).should.equal true
        done()

    it 'should not produce an error', (done) ->
      @DockerRunner._destroyContainer @containerId, false, (err) =>
        expect(err).to.equal null
        done()

    describe 'when the container is already gone', ->
      beforeEach ->
        @fakeError = new Error('woops')
        @fakeError.statusCode = 404
        @fakeContainer =
          remove: sinon.stub().callsArgWith(1, @fakeError)
        @Docker::getContainer = sinon.stub().returns(@fakeContainer)

      it 'should not produce an error', (done) ->
        @DockerRunner._destroyContainer @containerId, false, (err) =>
          expect(err).to.equal null
          done()

    describe 'when container.remove produces an error', ->
      beforeEach ->
        @fakeError = new Error('woops')
        @fakeError.statusCode = 500
        @fakeContainer =
          remove: sinon.stub().callsArgWith(1, @fakeError)
        @Docker::getContainer = sinon.stub().returns(@fakeContainer)

      it 'should produce an error', (done) ->
        @DockerRunner._destroyContainer @containerId, false, (err) =>
          expect(err).to.not.equal null
          expect(err).to.equal @fakeError
          done()
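    # The 404 case above suggests remove errors are filtered along these lines
    # (an illustrative sketch, not the actual source):
    destroyContainerSketch = (container, shouldForce, callback) ->
      container.remove force: shouldForce, (error) ->
        # A 404 means the container is already gone; treat that as success.
        error = null if error?.statusCode == 404
        callback(error)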
  describe 'kill', ->
    beforeEach ->
      @containerId = 'some_id'
      @fakeContainer =
        kill: sinon.stub().callsArgWith(0, null)
      @Docker::getContainer = sinon.stub().returns(@fakeContainer)

    it 'should get the container', (done) ->
      @DockerRunner.kill @containerId, (err) =>
        @Docker::getContainer.callCount.should.equal 1
        @Docker::getContainer.calledWith(@containerId).should.equal true
        done()

    it 'should try to kill the container', (done) ->
      @DockerRunner.kill @containerId, (err) =>
        @fakeContainer.kill.callCount.should.equal 1
        done()

    it 'should not produce an error', (done) ->
      @DockerRunner.kill @containerId, (err) =>
        expect(err).to.equal undefined
        done()

    describe 'when the container is not actually running', ->
      beforeEach ->
        @fakeError = new Error('woops')
        @fakeError.statusCode = 500
        @fakeError.message = 'Cannot kill container <whatever> is not running'
        @fakeContainer =
          kill: sinon.stub().callsArgWith(0, @fakeError)
        @Docker::getContainer = sinon.stub().returns(@fakeContainer)

      it 'should not produce an error', (done) ->
        @DockerRunner.kill @containerId, (err) =>
          expect(err).to.equal undefined
          done()

    describe 'when container.kill produces a legitimate error', ->
      beforeEach ->
        @fakeError = new Error('woops')
        @fakeError.statusCode = 500
        @fakeError.message = 'Totally legitimate reason to throw an error'
        @fakeContainer =
          kill: sinon.stub().callsArgWith(0, @fakeError)
        @Docker::getContainer = sinon.stub().returns(@fakeContainer)

      it 'should produce an error', (done) ->
        @DockerRunner.kill @containerId, (err) =>
          expect(err).to.not.equal undefined
          expect(err).to.equal @fakeError
          done()
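    # And the kill error-filtering these tests describe can be sketched as
    # (assumption, not the module's actual source): killing an already-stopped
    # container is benign and swallowed; any other error is passed through.
    killSketch = (container, callback) ->
      container.kill (error) ->
        return callback() if not error?
        return callback() if /is not running/.test(error.message or "")
        callback(error)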

View File

@@ -59,3 +59,21 @@ describe "LatexRunner", ->
         mainFile = command.slice(-1)[0]
         mainFile.should.equal "$COMPILE_DIR/main-file.tex"
 
+    describe "with a flags option", ->
+      beforeEach ->
+        @LatexRunner.runLatex @project_id,
+          directory: @directory
+          mainFile: @mainFile
+          compiler: @compiler
+          image: @image
+          timeout: @timeout = 42000
+          flags: ["-file-line-error", "-halt-on-error"]
+          @callback
+
+      it "should include the flags in the command", ->
+        command = @CommandRunner.run.args[0][1]
+        flags = command.filter (arg) ->
+          (arg == "-file-line-error") || (arg == "-halt-on-error")
+        flags.length.should.equal 2
+        flags[0].should.equal "-file-line-error"
+        flags[1].should.equal "-halt-on-error"
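
# For context, the command construction this asserts can be imagined along
# these lines. A minimal sketch assuming a latexmk-style base command;
# buildLatexCommandSketch and its base arguments are illustrative, not the
# actual LatexRunner code.
buildLatexCommandSketch = (mainFile, flags = []) ->
  command = ["latexmk", "-cd", "-f"]
  # Request flags are spliced in ahead of the main file argument,
  # so the main file stays last (command.slice(-1)[0] in the test).
  command = command.concat(flags)
  command.push("$COMPILE_DIR/#{mainFile}")
  command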

View File

@@ -5,11 +5,14 @@ modulePath = require('path').join __dirname, '../../../app/js/LockManager'
 Path = require "path"
 Errors = require "../../../app/js/Errors"
 
-describe "LockManager", ->
+describe "DockerLockManager", ->
   beforeEach ->
     @LockManager = SandboxedModule.require modulePath, requires:
       "settings-sharelatex": {}
-      "logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
+      "logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub(), err: -> }
+      "fs":
+        lstat: sinon.stub().callsArgWith(1)
+        readdir: sinon.stub().callsArgWith(1)
       "lockfile": @Lockfile = {}
     @lockFile = "/local/compile/directory/.project-lock"

View File

@@ -1,6 +1,7 @@
 SandboxedModule = require('sandboxed-module')
 sinon = require('sinon')
 require('chai').should()
+expect = require('chai').expect
 modulePath = require('path').join __dirname, '../../../app/js/RequestParser'
 tk = require("timekeeper")
@@ -16,11 +17,13 @@ describe "RequestParser", ->
       compile:
         token: "token-123"
         options:
+          imageName: "basicImageName/here:2017-1"
           compiler: "pdflatex"
           timeout: 42
         resources: []
-    @RequestParser = SandboxedModule.require modulePath
+    @RequestParser = SandboxedModule.require modulePath, requires:
+      "settings-sharelatex": @settings = {}
 
   afterEach ->
     tk.reset()
@@ -53,10 +56,32 @@ describe "RequestParser", ->
     beforeEach ->
       delete @validRequest.compile.options.compiler
       @RequestParser.parse @validRequest, (error, @data) =>
 
     it "should set the compiler to pdflatex by default", ->
       @data.compiler.should.equal "pdflatex"
 
+  describe "with imageName set", ->
+    beforeEach ->
+      @RequestParser.parse @validRequest, (error, @data) =>
+
+    it "should set the imageName", ->
+      @data.imageName.should.equal "basicImageName/here:2017-1"
+
+  describe "with flags set", ->
+    beforeEach ->
+      @validRequest.compile.options.flags = ["-file-line-error"]
+      @RequestParser.parse @validRequest, (error, @data) =>
+
+    it "should set the flags attribute", ->
+      expect(@data.flags).to.deep.equal ["-file-line-error"]
+
+  describe "with flags not specified", ->
+    beforeEach ->
+      @RequestParser.parse @validRequest, (error, @data) =>
+
+    it "should have an empty flags list", ->
+      expect(@data.flags).to.deep.equal []
+
   describe "without a timeout specified", ->
     beforeEach ->
       delete @validRequest.compile.options.timeout
@@ -79,7 +104,7 @@ describe "RequestParser", ->
     it "should set the timeout (in milliseconds)", ->
       @data.timeout.should.equal @validRequest.compile.options.timeout * 1000
 
   describe "with a resource without a path", ->
     beforeEach ->
       delete @validResource.path
@@ -166,7 +191,7 @@ describe "RequestParser", ->
     it "should return the url in the parsed response", ->
       @data.resources[0].url.should.equal @url
 
   describe "with a resource with a content attribute", ->
     beforeEach ->
       @validResource.content = @content = "Hello world"
@@ -176,7 +201,7 @@ describe "RequestParser", ->
     it "should return the content in the parsed response", ->
       @data.resources[0].content.should.equal @content
 
   describe "without a root resource path", ->
     beforeEach ->
       delete @validRequest.compile.rootResourcePath
@@ -216,13 +241,13 @@ describe "RequestParser", ->
} }
@RequestParser.parse @validRequest, @callback @RequestParser.parse @validRequest, @callback
@data = @callback.args[0][1] @data = @callback.args[0][1]
it "should return the escaped resource", -> it "should return the escaped resource", ->
@data.rootResourcePath.should.equal @goodPath @data.rootResourcePath.should.equal @goodPath
it "should also escape the resource path", -> it "should also escape the resource path", ->
@data.resources[0].path.should.equal @goodPath @data.resources[0].path.should.equal @goodPath
describe "with a root resource path that has a relative path", -> describe "with a root resource path that has a relative path", ->
beforeEach -> beforeEach ->
@validRequest.compile.rootResourcePath = "foo/../../bar.tex" @validRequest.compile.rootResourcePath = "foo/../../bar.tex"
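
# The option handling asserted above amounts to something like the following.
# A sketch only: the real parser also validates types, and the 60-second
# default timeout here is a placeholder assumption, not the real value.
parseCompileOptionsSketch = (options = {}) ->
  compiler: options.compiler or "pdflatex"
  timeout: (options.timeout or 60) * 1000   # request is in seconds, output in ms
  imageName: options.imageName
  flags: options.flags or []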

View File

@@ -134,6 +134,30 @@ describe "ResourceWriter", ->
         type: "aux"
       }, {
         path: "cache/_chunk1"
+      }, {
+        path: "figures/image-eps-converted-to.pdf"
+        type: "pdf"
+      }, {
+        path: "foo/main-figure0.md5"
+        type: "md5"
+      }, {
+        path: "foo/main-figure0.dpth"
+        type: "dpth"
+      }, {
+        path: "foo/main-figure0.pdf"
+        type: "pdf"
+      }, {
+        path: "_minted-main/default-pyg-prefix.pygstyle"
+        type: "pygstyle"
+      }, {
+        path: "_minted-main/default.pygstyle"
+        type: "pygstyle"
+      }, {
+        path: "_minted-main/35E248B60965545BD232AE9F0FE9750D504A7AF0CD3BAA7542030FC560DFCC45.pygtex"
+        type: "pygtex"
+      }, {
+        path: "_markdown_main/30893013dec5d869a415610079774c2f.md.tex"
+        type: "tex"
       }]
       @resources = "mock-resources"
       @OutputFileFinder.findOutputFiles = sinon.stub().callsArgWith(2, null, @output_files)
@@ -165,6 +189,46 @@ describe "ResourceWriter", ->
         .calledWith(path.join(@basePath, "cache/_chunk1"))
         .should.equal false
 
+    it "should not delete the epstopdf converted files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "figures/image-eps-converted-to.pdf"))
+        .should.equal false
+
+    it "should not delete the tikz md5 files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "foo/main-figure0.md5"))
+        .should.equal false
+
+    it "should not delete the tikz dpth files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "foo/main-figure0.dpth"))
+        .should.equal false
+
+    it "should not delete the tikz pdf files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "foo/main-figure0.pdf"))
+        .should.equal false
+
+    it "should not delete the minted pygstyle files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "_minted-main/default-pyg-prefix.pygstyle"))
+        .should.equal false
+
+    it "should not delete the minted default pygstyle files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "_minted-main/default.pygstyle"))
+        .should.equal false
+
+    it "should not delete the minted default pygtex files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "_minted-main/35E248B60965545BD232AE9F0FE9750D504A7AF0CD3BAA7542030FC560DFCC45.pygtex"))
+        .should.equal false
+
+    it "should not delete the markdown md.tex files", ->
+      @ResourceWriter._deleteFileIfNotDirectory
+        .calledWith(path.join(@basePath, "_markdown_main/30893013dec5d869a415610079774c2f.md.tex"))
+        .should.equal false
+
     it "should call the callback", ->
       @callback.called.should.equal true
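
# Taken together, these cases imply a keep-list along these lines. A sketch
# only; the exact patterns in ResourceWriter may differ.
shouldKeepOutputFileSketch = (path) ->
  keepPatterns = [
    /^cache\//                          # internal cache entries
    /-converted-to\.pdf$/               # epstopdf conversions
    /main-figure\d+\.(md5|dpth|pdf)$/   # tikz externalized figures
    /^_minted-/                         # minted styles and highlighted chunks
    /^_markdown_/                       # markdown package .md.tex output
  ]
  keepPatterns.some (pattern) -> pattern.test(path)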

View File

@@ -65,6 +65,22 @@ describe 'TikzManager', ->
         @callback.calledWithExactly(null, false)
           .should.equal true
 
+    describe "and the main file contains \\usepackage{pstool}", ->
+      beforeEach ->
+        @SafeReader.readFile = sinon.stub()
+          .withArgs("#{@compileDir}/#{@mainFile}")
+          .callsArgWith(3, null, "hello \\usepackage[random-options]{pstool}")
+        @TikzManager.checkMainFile @compileDir, @mainFile, @resources, @callback
+
+      it "should look at the file on disk", ->
+        @SafeReader.readFile
+          .calledWith("#{@compileDir}/#{@mainFile}")
+          .should.equal true
+
+      it "should call the callback with true", ->
+        @callback.calledWithExactly(null, true)
+          .should.equal true
+
   describe "injectOutputFile", ->
     beforeEach ->
       @rootDir = "/mock"

View File

@@ -7,17 +7,18 @@ EventEmitter = require("events").EventEmitter
 describe "UrlFetcher", ->
   beforeEach ->
     @callback = sinon.stub()
-    @url = "www.example.com/file"
+    @url = "https://www.example.com/file/here?query=string"
     @UrlFetcher = SandboxedModule.require modulePath, requires:
       request: defaults: @defaults = sinon.stub().returns(@request = {})
       fs: @fs = {}
       "logger-sharelatex": @logger = { log: sinon.stub(), error: sinon.stub() }
+      "settings-sharelatex": @settings = {}
 
   it "should turn off the cookie jar in request", ->
     @defaults.calledWith(jar: false)
       .should.equal true
 
-  describe "_pipeUrlToFile", ->
+  describe "rewrite url domain if filestoreDomainOveride is set", ->
     beforeEach ->
       @path = "/path/to/file/on/disk"
       @request.get = sinon.stub().returns(@urlStream = new EventEmitter)
@@ -26,21 +27,54 @@ describe "UrlFetcher", ->
       @urlStream.resume = sinon.stub()
       @fs.createWriteStream = sinon.stub().returns(@fileStream = new EventEmitter)
       @fs.unlink = (file, callback) -> callback()
-      @UrlFetcher.pipeUrlToFile(@url, @path, @callback)
 
-    it "should request the URL", ->
-      @request.get
-        .calledWith(sinon.match {"url": @url})
-        .should.equal true
+    it "should use the normal domain when override not set", (done)->
+      @UrlFetcher.pipeUrlToFile @url, @path, =>
+        @request.get.args[0][0].url.should.equal @url
+        done()
+      @res = statusCode: 200
+      @urlStream.emit "response", @res
+      @urlStream.emit "end"
+      @fileStream.emit "finish"
+
+    it "should use override domain when filestoreDomainOveride is set", (done)->
+      @settings.filestoreDomainOveride = "192.11.11.11"
+      @UrlFetcher.pipeUrlToFile @url, @path, =>
+        @request.get.args[0][0].url.should.equal "192.11.11.11/file/here?query=string"
+        done()
+      @res = statusCode: 200
+      @urlStream.emit "response", @res
+      @urlStream.emit "end"
+      @fileStream.emit "finish"
+
+  describe "pipeUrlToFile", ->
+    beforeEach (done)->
+      @path = "/path/to/file/on/disk"
+      @request.get = sinon.stub().returns(@urlStream = new EventEmitter)
+      @urlStream.pipe = sinon.stub()
+      @urlStream.pause = sinon.stub()
+      @urlStream.resume = sinon.stub()
+      @fs.createWriteStream = sinon.stub().returns(@fileStream = new EventEmitter)
+      @fs.unlink = (file, callback) -> callback()
+      done()
 
     describe "successfully", ->
-      beforeEach ->
+      beforeEach (done)->
+        @UrlFetcher.pipeUrlToFile @url, @path, =>
+          @callback()
+          done()
         @res = statusCode: 200
         @urlStream.emit "response", @res
         @urlStream.emit "end"
         @fileStream.emit "finish"
 
+      it "should request the URL", ->
+        @request.get
+          .calledWith(sinon.match {"url": @url})
+          .should.equal true
+
       it "should open the file for writing", ->
         @fs.createWriteStream
           .calledWith(@path)
@@ -55,7 +89,10 @@ describe "UrlFetcher", ->
         @callback.called.should.equal true
 
     describe "with non success status code", ->
-      beforeEach ->
+      beforeEach (done)->
+        @UrlFetcher.pipeUrlToFile @url, @path, (err)=>
+          @callback(err)
+          done()
         @res = statusCode: 404
         @urlStream.emit "response", @res
         @urlStream.emit "end"
@@ -66,7 +103,10 @@ describe "UrlFetcher", ->
           .should.equal true
 
     describe "with error", ->
-      beforeEach ->
+      beforeEach (done)->
+        @UrlFetcher.pipeUrlToFile @url, @path, (err)=>
+          @callback(err)
+          done()
         @urlStream.emit "error", @error = new Error("something went wrong")
 
       it "should call the callback with the error", ->