Cluster TAL:FRED


Installation procedure

System version

FRED was installed on May 12, 2016 on the FRED virtual machine of LIPN.RCLN.cluster.TAL

$ uname -a
Linux tal-fred 3.13.0-66-generic #108-Ubuntu SMP Wed Oct 7 15:20:27 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
$ cat /etc/issue
Ubuntu 14.04.3 LTS

Requirements

  1. SWI-Prolog version 6.3.19, downloaded from the SWI developers page
  2. C&C (local distribution)
  3. Boxer, FRED (local distribution with links to fredlib, api, etc.)
  4. CoreNLP (note: it works only with version 3.4.1)
  5. CoreNLP Python wrapper for 3.4.1
  6. Babelfy instead of Tagme, used as a service with our local credentials

SWI-Prolog

Sources for this procedure:

/opt/FRED/externals/pl-6.3.19/README.linux
/opt/FRED/externals/pl-6.3.19/INSTALL

SWI Dependencies

  • gcc is already installed
$ gcc --version
gcc (Ubuntu 4.8.4-2ubuntu1~14.04) 4.8.4
Copyright (C) 2013 Free Software Foundation, Inc.
  • make is already installed
$ make --version
GNU Make 3.81
Copyright (C) 2006  Free Software Foundation, Inc.
  • installing autoconf
$ sudo apt-get install autoconf
  • looks like the gmp library is already installed
$ locate gmp
/usr/include/boost/multiprecision/gmp.hpp
/usr/include/boost/polygon/gmp_override.hpp
/usr/include/linux/igmp.h
/usr/include/netinet/igmp.h
/usr/lib/x86_64-linux-gnu/libgmp.so.10
/usr/lib/x86_64-linux-gnu/libgmp.so.10.1.3
/usr/lib/x86_64-linux-gnu/openssl-1.0.0/engines/libgmp.so
  • looks like the readline library is already installed
$ locate readline
/lib/x86_64-linux-gnu/libreadline.so.5
/lib/x86_64-linux-gnu/libreadline.so.5.2
/lib/x86_64-linux-gnu/libreadline.so.6
/lib/x86_64-linux-gnu/libreadline.so.6.3
  • libXt, X11 core libraries, libjpeg and libxpm
$ sudo apt-get install libxt-dev libjpeg-dev libxpm-dev
  • libXft, libfontconfig and pkg-config
$ sudo apt-get install libxft-dev libfontconfig1-dev pkg-config

SWI Prolog compile and build

We have to copy build.templ to build

$ cp build.templ build

We create a symbolic link for SWI Prolog

$ ln -s /opt/FRED/externals/pl-6.3.19 /opt/FRED/externals/SWI

Now we edit lines 19 to 21 of /opt/FRED/externals/SWI/build to set the installation path and sudo

PREFIX=/usr/local/
#SUDO=                                                                                                                                                        
SUDO="sudo"

...and then we comment line 30 and uncomment line 31

# MAKE=make
MAKE='make --jobs=4'

Then we run the build script (I had to run it twice to get an error-free compilation).

$ ./build
make[2]: Leaving directory '/opt/FRED/externals/pl-6.3.19/packages/PDT'
make[2]: Entering directory '/opt/FRED/externals/pl-6.3.19/packages/utf8proc'
mkdir -p /users/garciaflores/lib/swipl-6.3.19/doc/packages
/usr/bin/install -c -m 644 utf8proc.html /users/garciaflores/lib/swipl-6.3.19/doc/packages
make[2]: Leaving directory '/opt/FRED/externals/pl-6.3.19/packages/utf8proc'
make[2]: Entering directory '/opt/FRED/externals/pl-6.3.19/packages/archive'
mkdir -p /users/garciaflores/lib/swipl-6.3.19/doc/packages
/usr/bin/install -c -m 644 archive.html /users/garciaflores/lib/swipl-6.3.19/doc/packages
make[2]: Leaving directory '/opt/FRED/externals/pl-6.3.19/packages/archive'
/usr/bin/install -c -m 644 index.html /users/garciaflores/lib/swipl-6.3.19/doc/packages
make[1]: Leaving directory '/opt/FRED/externals/pl-6.3.19/packages'

And then we test the installation

$ swipl --version
SWI-Prolog version 6.3.19 for x86_64-linux

C&C

The local Boxer / C&C distribution delivered with FRED should look like this:

$ pwd
/opt/FRED/BoxerServer
$ ll
total 72
drwx------  3 garciaflores users  4096 Feb   4  2016 .
drwxrwxrwx  8 garciaflores users  4096 May  13 00:51 ..
-rw-r--r--  1 garciaflores users  4096 Feb  10  2016 ._boxer options
-rw-r--r--  1 garciaflores users  3793 Feb  10  2016 boxer options
drwx------ 10 garciaflores users  4096 Feb   4  2016 candc
-rw-r--r--  1 garciaflores users  4096 Feb  10  2016 ._candc
-rw-r--r--  1 garciaflores users  4096 Feb  10  2016 ._candccommands.txt
-rw-r--r--  1 garciaflores users 15945 Feb  10  2016 candccommands.txt
-rw-r--r--  1 garciaflores users  4096 Feb  10  2016 ._conversion table.txt
-rw-r--r--  1 garciaflores users  3834 Feb  10  2016 conversion table.txt
-rw-r--r--  1 garciaflores users  4096 Feb  10  2016 ._.DS_Store
-rw-r--r--  1 garciaflores users  6148 Feb  10  2016 .DS_Store
-rw-r--r--  1 garciaflores users  4096 Feb  10  2016 ._launch.txt
-rw-r--r--  1 garciaflores users    95 Feb  10  2016 launch.txt

As the C&C website is not reachable at the moment, we have to look up the old C&C installation documentation on the Wayback Machine.

So first we go to the candc directory

$ cd BoxerServer/candc

Then we create a symbolic link to the makefile for unix

$ ln -s Makefile.unix Makefile

And we make...

$ make

You might get an error about permissions on the src/scripts/version file, so you should chmod it

$ chmod a+x src/scripts/version

If you get errors about unrecognized object files like this one

src/main/pos.o: file not recognized: File format not recognized

...you should erase the file and run make again

$ rm src/main/pos.o
$ make

Afterwards I got a sleep function error.

src/lib/extract/_baseimpl.cc: In member function ‘virtual void NLP::Extract::_BaseImpl::_pass1(NLP::IO::Reader&, bool)’:
src/lib/extract/_baseimpl.cc:49:10: error: ‘sleep’ was not declared in this scope
  sleep(3);
         ^ 
make: *** [src/lib/extract/_baseimpl.o] Error 1

I hesitated between using the C++ sleep_for function and just commenting out the line in src/lib/extract/_baseimpl.cc. I did the latter and apparently it worked (adding #include <unistd.h> at the top of the file would probably also fix it, since that header declares sleep on Linux):

// Commented by JGF for FRED 12/oct/16
// sleep(3);

C&C Soap Server

There's a dedicated doc page for building the C&C SOAP server, so here we go. The first thing we check is that there's an ext directory inside the candc distribution:

$ pwd
/opt/FRED/BoxerServer/candc
$ ls
bin  ext         lib                LICENCE.txt  Makefile.cygwin  Makefile.macosx   Makefile.mingw  Makefile.targets  models       src        test.ccg
doc  grepsource  LICENCE-BOXER.txt  Makefile     Makefile.deps    Makefile.macosxu  Makefile.sunos  Makefile.unix     RELEASE.txt  test1.ccg  working

gSOAP is already installed in the ext directory:

$ ls ext/
bin  gsoap-2.8  include  lib  share

So we follow the build instructions from the C&C SOAP server manual:

$ cd ext/gsoap-2.8
$ chmod a+x configure
$ ./configure --prefix=/opt/FRED/BoxerServer/candc/ext/
$ make

I got an aclocal-1.10 error

cd . && /bin/bash /opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing --run aclocal-1.10
/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing: line 46: aclocal-1.10: command not found
WARNING: `aclocal-1.10' is needed, and you do not seem to have it handy on your
        system.  You might have modified some files without having the
        proper tools for further handling them.  Check the `README' file,
        it often tells you about the needed prerequirements for installing
        this package.  You may also peek at any GNU archive site, in case
        some other package would contain this missing `aclocal-1.10' program.

As we only have aclocal 1.14, we manually edit lines 89 and 98 of ext/gsoap-2.8/Makefile to update the aclocal version

ACLOCAL = ${SHELL} /opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing --run aclocal-1.14
# ACLOCAL = ${SHELL} /opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing --run aclocal-1.10  
[...]
AUTOMAKE = ${SHELL} /opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing --run automake-1.14
# AUTOMAKE = ${SHELL} /opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing --run automake-1.10

And then run make again

$ make

aclocal works now, but make crashes with a missing yacc error

make[4]: Entering directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap/src'
/bin/bash ../../ylwrap soapcpp2_yacc.y y.tab.c soapcpp2_yacc.c y.tab.h `echo soapcpp2_yacc.c | sed -e s/cc$/hh/ -e s/cpp$/hpp/ -e s/cxx$/hxx/ -e s/c++$/h++/ -e s/c$/h/`  y.output soapcpp2_yacc.output -- yacc -d -v
../../ylwrap: line 111: yacc: command not found
make[4]: *** [soapcpp2_yacc.c] Error 1
make[4]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap/src'
make[3]: *** [all-recursive] Error 1
make[3]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap'
make[2]: *** [all] Error 2
make[2]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8'
make: *** [all] Error 2

So we install yacc, clean and make again

$ sudo apt-get install byacc flex
$ make clean
$ ./configure --prefix=/opt/FRED/BoxerServer/candc/
$ make

yacc looks OK now, but the build crashes with a strange linking error

gcc -DWITH_YACC -DWITH_FLEX  -DSOAPCPP_IMPORT_PATH="\"/opt/FRED/BoxerServer/candc/share/gsoap/import\"" -DLINUX -g -O2   -o soapcpp2 soapcpp2-soapcpp2_yacc.o soapcpp2- soapcpp2_lex.o soapcpp2-symbol2.o soapcpp2-error2.o soapcpp2-init2.o soapcpp2-soapcpp2.o -ly -lfl
/usr/bin/ld: cannot find -ly
collect2: error: ld returned 1 exit status
make[4]: *** [soapcpp2] Error 1
make[4]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap/src'
make[3]: *** [all-recursive] Error 1
make[3]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap'
make[2]: *** [all] Error 2
make[2]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8'
make: *** [all] Error 2

Apparently, this -ly linker flag corresponds to line 299 of the Makefile

YACC_LIB = -ly

We try installing the ml-yacc and libbison-dev Ubuntu packages

$ sudo apt-get install ml-yacc libbison-dev

Now we get errors concerning the ssl and crypto libraries, so we install the libssl-dev package and run make once more

$ sudo apt-get install libssl-dev

And it's done!

make[5]: Entering directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap/wsdl'
make[5]: Nothing to be done for 'all-am'.
make[5]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap/wsdl'
make[4]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap/wsdl'
make[3]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap'
make[2]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/gsoap'
make[2]: Entering directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8'
make[2]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8'
make[1]: Leaving directory '/opt/FRED/BoxerServer/candc/ext/gsoap-2.8'

...so we go for the install

$ make install
+--------------------------------------------------------+
| You now have successfully built and installed gsoap.   |
|                                                        |
| You can link your programs with -lgsoap++ for          |
| C++ projects created with soapcpp2 and you can link    |
| with -lgsoap for C projects generated with soapcpp2 -c |
|                                                        |
| There are also corresponding libraries for SSL and     |
| zlib compression support (-lgsoapssl and lgsoapssl++)  |
| which require linking -lssl -lcrypto -lz               |
|                                                        |
| Thanks for using gsoap.                                |
|                                                        |
|               http://sourceforge.net/projects/gsoap2   |
+--------------------------------------------------------+

Boxer and statistical models

Following the documentation (Step 6), we just go to the candc directory and make...

$ cd /opt/FRED/BoxerServer/candc
$ make bin/boxer
%   boxer(printDrs) compiled into printDrs 0,00 sec, 85 clauses
%    semlib(drs2tacitus) compiled into drs2tacitus 0,00 sec, 101 clauses
%   boxer(tuples) compiled into tuples 0,01 sec, 276 clauses
%    semlib(drs2fol) compiled into drs2fol 0,00 sec, 122 clauses
%   semlib(drs2tex) compiled into drs2tex 0,00 sec, 154 clauses
%  boxer(output) compiled into output 0,02 sec, 823 clauses
[...]


% autoloading pce_goal_expansion:(append/3) from /usr/local/lib/swipl-6.3.19/library/lists
% autoloading pce_messages:(get/3) from /usr/local/lib/swipl-6.3.19/xpce/prolog/lib/pce
% autoloading pce_principal:(pce_info/1) from /usr/local/lib/swipl-6.3.19/xpce/prolog/lib/swi_compatibility
% autoloading pce_host:(send/2) from /usr/local/lib/swipl-6.3.19/xpce/prolog/lib/pce
% autoloading pce_portray:(portray_clause/1) from /usr/local/lib/swipl-6.3.19/library/listing
% Autoloader: iteration 2 resolved 21 predicates and loaded 28 files in 0,074 seconds.  Restarting ...
% Autoloader: loaded 33 files in 3 iterations in 0,205 seconds

Finally, we just check that the statistical models are there:

$ ls models/
boxer  chunk_quotes  muc  noquotes  pos           pos_questions  questions  super_noquotes   super_quotes
chunk  config        ner  parser    pos_noquotes  pos_quotes     super      super_questions  verbstem.list

And do some testing from C&C examples page

$ bin/candc --models models
# this file was generated by the following command(s):
#   bin/candc --models models
# this file was generated by the following command(s):
#   bin/candc --models models

You have to type a sentence (like 'I would like to go'):

I would like to go
1 parsed at B=0.075, K=20
1 coverage 100%
(xcomp to_3 like_2 go_4)
(aux like_2 would_1)
(ncsubj like_2 I_0 _)
(ncsubj go_4 I_0 _)
<c> I|I|PRP|I-NP|O|NP would|would|MD|I-VP|O|(S[dcl]\NP)/(S[b]\NP) like|like|VB|I-VP|O|(S[b]\NP)/(S[to]\NP) to|to|TO|I-VP|O|(S[to]\NP)/(S[b]\NP) go|go|VB|I-VP|O|S[b]\NP
1 stats 0.693147 25 25 comb 20 13 0 0
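The grammatical-relation lines printed above are easy to consume downstream. As an illustration, here is a minimal Python sketch that reads them into tuples (the parse_grs helper is ours, not part of C&C; the relation format is taken from the transcript above):

```python
def parse_grs(lines):
    """Parse C&C grammatical-relation lines like '(ncsubj like_2 I_0 _)'
    into (relation, arguments) tuples."""
    rels = []
    for line in lines:
        line = line.strip()
        if line.startswith("(") and line.endswith(")"):
            parts = line[1:-1].split()
            rels.append((parts[0], tuple(parts[1:])))
    return rels

# Sample output copied from the C&C run above.
output = """(xcomp to_3 like_2 go_4)
(aux like_2 would_1)
(ncsubj like_2 I_0 _)
(ncsubj go_4 I_0 _)"""

for rel, args in parse_grs(output.splitlines()):
    print(rel, args)
```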

Stanford Core NLP v.3.4.1

FRED works only with CoreNLP 3.4.1, so we go to the Stanford CoreNLP release history page to download this specific version.

$ cd /opt/FRED/externals/tgz
$ wget http://nlp.stanford.edu/software/stanford-corenlp-full-2014-08-27.zip
$ cd ..
$ unzip tgz/stanford-corenlp-full-2014-08-27.zip

Now we follow the "Using Stanford CoreNLP from the command line" documentation page. So we go to Core NLP root directory and run...

$ java -cp "*" -Xmx2g edu.stanford.nlp.pipeline.StanfordCoreNLP -annotators tokenize,ssplit,pos,lemma,ner,parse,dcoref -file input.txt
Adding annotator tokenize
TokenizerAnnotator: No tokenizer type provided. Defaulting to PTBTokenizer.
Adding annotator ssplit
edu.stanford.nlp.pipeline.AnnotatorImplementations:
Adding annotator pos
Reading POS tagger model from edu/stanford/nlp/models/pos-tagger/english-left3words/english-left3words-distsim.tagger ... done [1,5 sec].
Adding annotator lemma
Adding annotator ner
Loading classifier from edu/stanford/nlp/models/ner/english.all.3class.distsim.crf.ser.gz ... done [4,6 sec].
Loading classifier from edu/stanford/nlp/models/ner/english.muc.7class.distsim.crf.ser.gz ... done [2,5 sec].
Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... done [2,0 sec].
sutime.binder.1.
Initializing JollyDayHoliday for sutime with classpath:edu/stanford/nlp/models/sutime/jollyday/Holidays_sutime.xml
Reading TokensRegex rules from edu/stanford/nlp/models/sutime/defs.sutime.txt
Reading TokensRegex rules from edu/stanford/nlp/models/sutime/english.sutime.txt
Oct 17, 2016 11:54:14 PM edu.stanford.nlp.ling.tokensregex.CoreMapExpressionExtractor appendRules
INFO: Ignoring inactive rule: null
Oct 17, 2016 11:54:14 PM edu.stanford.nlp.ling.tokensregex.CoreMapExpressionExtractor appendRules
INFO: Ignoring inactive rule: temporal-composite-8:ranges
Reading TokensRegex rules from edu/stanford/nlp/models/sutime/english.holidays.sutime.txt
Adding annotator parse
Loading parser from serialized file edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz ...done [0,6 sec].
Adding annotator dcoref 
Ready to process: 1 files, skipped 0, total 1
Processing file /opt/FRED/externals/stanford-corenlp-full-2014-08-27/input.txt ... writing to /opt/FRED/externals/stanford-corenlp-full-2014-08-27/input.txt.xml {
Annotating file /opt/FRED/externals/stanford-corenlp-full-2014-08-27/input.txt [1.505 seconds]
} [1.588 seconds]
Processed 1 documents
Skipped 0 documents, error annotating 0 documents
Annotation pipeline timing information:
TokenizerAnnotator: 0,0 sec.
WordsToSentencesAnnotator: 0,0 sec.
POSTaggerAnnotator: 0,0 sec.
MorphaAnnotator: 0,1 sec.
NERCombinerAnnotator: 0,4 sec.
ParserAnnotator: 0,9 sec.
DeterministicCorefAnnotator: 0,1 sec.
TOTAL: 1,5 sec. for 17 tokens at 11,3 tokens/sec.
Pipeline setup: 0,0 sec.
Total time for StanfordCoreNLP pipeline: 1,6 sec.

According to the documentation, this command processes a file called input.txt and produces an input.txt.xml file with POS, named-entity, and lemma annotations. There's some configuration still to do (classpath, properties file), but we will wait until we know exactly how FRED uses CoreNLP before configuring further.
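For later scripting, the same invocation can be assembled from Python. This is only a sketch: the classpath, memory limit, and annotator list simply mirror the manual command above, and corenlp_command is a hypothetical helper name of ours:

```python
# Sketch: rebuild the CoreNLP command line shown above. Assumes we run
# from the CoreNLP root directory, as in the transcript, so "-cp *"
# picks up the distribution jars.
def corenlp_command(input_file, memory="2g",
                    annotators=("tokenize", "ssplit", "pos", "lemma",
                                "ner", "parse", "dcoref")):
    return ["java", "-cp", "*", "-Xmx" + memory,
            "edu.stanford.nlp.pipeline.StanfordCoreNLP",
            "-annotators", ",".join(annotators),
            "-file", input_file]

cmd = corenlp_command("input.txt")
print(" ".join(cmd))
# To actually run it, pass cmd to subprocess.call(...) with
# cwd set to the CoreNLP root directory.
```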

Python interface to Stanford Core NLP tools v3.4.1

So we go back to the /opt/FRED/externals directory and clone the Stanford CoreNLP Python wrapper

$ cd /opt/FRED/externals
$ git clone https://github.com/dasmith/stanford-corenlp-python.git

We check the Python version and install pip and the wrapper dependencies:

$ python --version
Python 2.7.6
$ sudo apt-get install python-pip
$ sudo pip install pexpect unidecode

Then we follow the Python wrapper documentation, which specifies that Stanford CoreNLP must be a child directory of the Python wrapper, so we move our CoreNLP directory inside the wrapper's directory:

$ pwd
/opt/FRED/externals
$ ls
pl-6.3.19  stanford-corenlp-full-2014-08-27  stanford-corenlp-python  swi-prolog  tgz
$ mv stanford-corenlp-full-2014-08-27/ stanford-corenlp-python/
$ ln -s stanford-corenlp-python/stanford-corenlp-full-2014-08-27/ stanford-corenlp

Then we launch the wrapper's server

$ python corenlp.py
Loading Models: 5/5
INFO:__main__:Serving on http://127.0.0.1:8080

There's a client.py program for testing the wrapper:

$ python client.py
{u'sentences': [{u'parsetree': u'(ROOT (S (VP (NP (INTJ (UH Hello)) (NP (NN world)))) (. !)))',
                u'text': u'Hello world!',
                u'tuples': [[u'dep', u'world', u'Hello'],
                            [u'root', u'ROOT', u'world']],
                u'words': [[u'Hello',
                            {u'CharacterOffsetBegin': u'0',
                             u'CharacterOffsetEnd': u'5',
                             u'Lemma': u'hello',
                             u'NamedEntityTag': u'O',
                             u'PartOfSpeech': u'UH'}],
[...]
Traceback (most recent call last):
  File "client.py", line 17, in <module>
     from nltk.tree import Tree
ImportError: No module named nltk.tree

So we must install NLTK, which looks like a dependency of the wrapper:

$ sudo pip install -U nltk
$ python
>>> import nltk
$ sudo python -m nltk.downloader -d /usr/local/share/nltk_data all

We test again

$ python client.py
Traceback (most recent call last):
  File "client.py", line 18, in <module>
     tree = Tree.parse(result['sentences'][0]['parsetree'])

We still have an error, but it doesn't look serious, so we're going to ignore it for now and move on.
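The remaining traceback most likely comes from an NLTK API change: recent NLTK versions renamed Tree.parse to Tree.fromstring, so patching that call in client.py should silence it. Meanwhile, the wrapper's result dict can be used directly without NLTK at all. A minimal sketch (the sample dict is abridged from the client.py output above; the second token's attributes are reconstructed by analogy, so treat them as illustrative):

```python
# Abridged result dict, modeled on the client.py output above.
result = {u'sentences': [{u'parsetree': u'(ROOT (S (VP (NP (INTJ (UH Hello)) (NP (NN world)))) (. !)))',
                          u'text': u'Hello world!',
                          u'words': [[u'Hello', {u'Lemma': u'hello',
                                                 u'NamedEntityTag': u'O',
                                                 u'PartOfSpeech': u'UH'}],
                                     [u'world', {u'Lemma': u'world',
                                                 u'NamedEntityTag': u'O',
                                                 u'PartOfSpeech': u'NN'}]]}]}

def lemmas(sentence):
    """Extract (token, lemma, POS) triples from one wrapper sentence."""
    return [(tok, attrs[u'Lemma'], attrs[u'PartOfSpeech'])
            for tok, attrs in sentence[u'words']]

print(lemmas(result[u'sentences'][0]))
```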

Babelfy

I can't find any reference to entity disambiguation with Babelfy in the FRED code, so I won't proceed with the installation from the Babelfy download page. Maybe it's a TODO to replace the Tagme calls (which are still inside the FRED code) with Babelfy calls.

$ find . -name "*.py" -exec grep -Hn agme {} \;
./fred-corenlp/server-fred-paris.py:139:        tagmeEntities = {}
./fred-corenlp/server-fred-paris.py:142:                tagmeEntities = utils.tagme(cleanedText)
./fred-corenlp/server-fred-paris.py:143:                print "tagmeEntities",tagmeEntities,type(tagmeEntities)
./fred-corenlp/server-fred-paris.py:151:            for el in tagmeEntities:
./fred-corenlp/server-fred-paris.py:152:                if tagmeEntities[el]!="OK":
./fred-corenlp/server-fred-paris.py:153:                    posTag = tagmeEntities[el][0]
./fred-corenlp/server-fred-paris.py:154:                    lenTag = tagmeEntities[el][1]
./fred-corenlp/server-fred-paris.py:208:            for el in tagmeEntities:
./fred-corenlp/server-fred-paris.py:209:                if tagmeEntities[el]!="OK":
./fred-corenlp/server-fred-paris.py:210:                    posTag = tagmeEntities[el][0]
./fred-corenlp/server-fred-paris.py:211:                    lenTag = tagmeEntities[el][1]
./fred-corenlp/utils.py:110:def tagme(sentence):
./fred-corenlp/utils.py:117:    url = "http://tagme.di.unipi.it/api?text=%s&key=bc70153a603d9de7e79c244c41270913&lang=en" % (sentence
$ find . -name "*.py" -exec grep -Hn abelfly {} \;
[ ]
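If we ever do the Tagme to Babelfy switch, the call in utils.py could be adapted along these lines. This is a sketch only: the endpoint and parameter names follow the public Babelfy HTTP API as we understand it (to be checked against the current docs), and BABELFY_KEY is a placeholder for our local credentials:

```python
# Sketch: build a Babelfy disambiguation request analogous to the
# tagme() helper in fred-corenlp/utils.py. Endpoint and parameter
# names are assumptions to verify against the Babelfy documentation.
from urllib.parse import urlencode

BABELFY_URL = "https://babelfy.io/v1/disambiguate"
BABELFY_KEY = "OUR-LOCAL-KEY"  # placeholder for our local credentials

def babelfy_request(sentence, lang="EN"):
    params = urlencode({"text": sentence, "lang": lang, "key": BABELFY_KEY})
    return "%s?%s" % (BABELFY_URL, params)

print(babelfy_request("I would like to go"))
# Fetch with urllib.request.urlopen(...) and json.loads(...) once a
# real key is configured.
```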

Configuration

First we will go to fred-corenlp directory

$ cd /opt/FRED/fred-corenlp

There, we will edit the config.py file to add the candc path on line 5

CANDC_BIN_PATH = '/opt/FRED/BoxerServer/candc'

...and line 159 with the right nltk_data path

NLTK_PATH = '/usr/local/share/nltk_data'   

Then we go back to FRED root to edit Boxer's files

$ cd ..
$ emacs -nw localboxerclient localboxerserver

In both files we set candc root:

PREFIX=/opt/FRED/BoxerServer/candc

localboxerserver should look like:

#!/bin/bash
PREFIX=/opt/FRED/BoxerServer/candc                  
$PREFIX/bin/soap_server --server localhost:9000 --models $PREFIX/models/boxer --candc-printer boxer --candc-int-betas "0 0 0 0 0"
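Once the server is running, a quick way to verify that localboxerserver is actually listening on localhost:9000 (the port hard-coded above) is a plain socket probe. A minimal sketch:

```python
# Sketch: check whether the Boxer soap_server is accepting connections
# on localhost:9000, the port used in localboxerserver above.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if port_open("localhost", 9000):
    print("soap_server is up")
else:
    print("soap_server is not reachable")
```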

Testing

We first go to FRED root

$ cd /opt/FRED

And launch the boxer server

$ sh launchboxerserver

We get a permission error, so we add execute permission to both files

$ sudo chmod a+x BoxerServer/candc/bin/soap_client 
$ sudo chmod a+x BoxerServer/candc/bin/soap_server

And we get more errors:

$ sh launchboxerserver
/opt/FRED/BoxerServer/candc/bin/soap_server: 1: /opt/FRED/BoxerServer/candc/bin/soap_server: Syntax error: "(" unexpected
$ sh launchboxerclient
localboxerclient: 5: localboxerclient: /opt/FRED/BoxerServer/candc/bin/tokkie: Permission denied
/opt/FRED/BoxerServer/candc/bin/soap_client: 1: /opt/FRED/BoxerServer/candc/bin/soap_client: Syntax error: word unexpected (expecting ")")
ERROR: file /tmp/boxer.ccg does not exist

TODO: recompile the SOAP client and server. The sh "Syntax error" messages suggest that bin/soap_client and bin/soap_server are not valid Linux executables at all, so they probably need to be rebuilt from the gSOAP sources.