- [[http://www.swi-prolog.org/download/devel/src/pl-6.3.19.tar.gz | SWI Prolog version 6.3.19]] downloaded from [[http://www.swi-prolog.org/download/devel?show=all | this SWI developers page]]
- C&C (local distribution)
- Boxer, FRED (local distrib with links to fredlib, api, etc.)
- [[https://stanfordnlp.github.io/CoreNLP/ | CoreNLP]] (note: it works only with [[http://nlp.stanford.edu/software/stanford-corenlp-full-2014-08-27.zip | version 3.4.1]])
- [[https://github.com/dasmith/stanford-corenlp-python | CoreNLP Python wrapper for 3.4.1]]
- [[http://babelfy.org/ | Babelfy]] instead of [[http://tagme.di.unipi.it | Tagme]], as a service with our local credentials.
==== SWI-Prolog ====
</code>
===== C&C =====
The local Boxer - C&C distribution delivered by FRED should look like this:
As the [[http://svn.ask.it.usyd.edu.au/trac/candc/wiki | C&C web]] is not visible at the moment, we have to look for the [[http://web.archive.org/web/20160313031620/http://svn.ask.it.usyd.edu.au/trac/candc/wiki/Installation | old C&C installation documentation]] on the Wayback Machine.
So first we go to the candc directory:
<code>
$ cd BoxerServer/candc
</code>
Then we create a symbolic link to the makefile for Unix:
<code> $ ln -s Makefile.unix Makefile </code>
And we make...
<code> $ make </code>
You might get an error about permissions on the ''src/scripts/version'' file, so you should **chmod** it:
<code> chmod a+x src/scripts/version </code>
If you get errors about unrecognized object files like this one:
<code> src/main/pos.o: file not recognized: Format de fichier non reconnu </code>
...you should delete the file and run make again:
<code>
$ rm src/main/pos.o
$ make
</code>
Afterwards I got a **sleep** function error.
<code>
src/lib/extract/_baseimpl.cc: In member function ‘virtual void NLP::Extract::_BaseImpl::_pass1(NLP::IO::Reader&, bool)’:
src/lib/extract/_baseimpl.cc:49:10: error: ‘sleep’ was not declared in this scope
^
make: *** [src/lib/extract/_baseimpl.o] Erreur 1
</code>
I hesitated between using the [[http://www.cplusplus.com/reference/thread/this_thread/sleep_for/ | sleep_for C++ function]] and just commenting out the line in ''src/lib/extract/_baseimpl.cc''. I did the latter and apparently it worked:
<code>
// Commented by JGF for FRED 12/oct/16
// sleep(3);
</code>
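A third option (not tried here, just a sketch): ''sleep'' is declared in the POSIX header ''unistd.h'', and newer GCC versions no longer pull it in transitively, so adding the include at the top of the file should also clear the error while keeping the delay. Demonstrated on a stand-in file rather than the real source:

```shell
# Prepend the missing POSIX header; sleep() is declared in <unistd.h>.
# /tmp/_baseimpl-demo.cc stands in for src/lib/extract/_baseimpl.cc.
printf '// original first line\n' > /tmp/_baseimpl-demo.cc
sed -i '1i #include <unistd.h>' /tmp/_baseimpl-demo.cc
head -n 1 /tmp/_baseimpl-demo.cc
```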
=== C&C Soap Server ===
There's [[http://web.archive.org/web/20150304125339/http://svn.ask.it.usyd.edu.au/trac/candc/wiki/InstallSOAP | a dedicated doc page]] for building the C&C Soap server. So here we go. The first thing we check is that there's an **ext** directory inside the ''candc'' distrib.
//GSoap// is already installed in the ''ext'' directory:
<code>
$ ls ext/
bin gsoap-2.8 include lib share
</code>
So we follow the building instructions of the [[http://web.archive.org/web/20150304125339/http://svn.ask.it.usyd.edu.au/trac/candc/wiki/InstallSOAP | C&C Soap server manual]]:
cd . && /bin/bash /opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing --run aclocal-1.10
/opt/FRED/BoxerServer/candc/ext/gsoap-2.8/missing: ligne 46: aclocal-1.10 : commande introuvable
this package. You may also peek at any GNU archive site, in case
some other package would contain this missing `aclocal-1.10' program.
</code>
As we only have an aclocal-1.14 version, I manually edit ''ext/gsoap-2.8/Makefile'', changing lines **89** and **98** to update the aclocal version:
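The same edit can be scripted (a sketch only: the actual content of the Makefile around lines 89 and 98 is an assumption, so it's demonstrated on a stand-in file):

```shell
# Rewrite the hard-coded aclocal-1.10 references to the installed 1.14.
# /tmp/gsoap-makefile-demo stands in for ext/gsoap-2.8/Makefile.
printf 'ACLOCAL = aclocal-1.10\n' > /tmp/gsoap-makefile-demo
sed -i 's/aclocal-1\.10/aclocal-1.14/g' /tmp/gsoap-makefile-demo
cat /tmp/gsoap-makefile-demo
```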
Following [[http://web.archive.org/web/20160313031620/http://svn.ask.it.usyd.edu.au/trac/candc/wiki/Installation | the documentation (Step 6)]], we just go to the //candc// directory and make...
<code>
$ cd /opt/FRED/BoxerServer/candc
$ make bin/boxer
% Autoloader: iteration 2 resolved 21 predicates and loaded 28 files in 0,074 seconds. Restarting ...
% Autoloader: loaded 33 files in 3 iterations in 0,205 seconds
</code>
Finally, we just check that the statistical models are there:
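For instance (a sketch; the exact location of the models inside the local distrib is an assumption, C&C usually unpacks them into a ''models/'' directory next to ''bin/''). Run here in an empty scratch directory, so it reports them missing; in the real candc tree it should list the model subdirectories:

```shell
# Sanity check: list the statistical models if present, complain otherwise.
cd "$(mktemp -d)"
ls models 2>/dev/null || echo "models directory missing"
```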
FRED works only with Core NLP 3.4.1, so we should go to the [[https://stanfordnlp.github.io/CoreNLP/history.html | Stanford Core NLP release history page]] in order to download this specific version.
Now we follow the "[[https://stanfordnlp.github.io/CoreNLP/cmdline.html | Using Stanford CoreNLP from the command line]]" documentation page. So we go to the Core NLP root directory and run...
According to the documentation, this command processes a file called ''input.txt'' and produces an ''input.txt.xml'' file with POS, named entity and lemma annotations. There's some configuration to do (classpath, properties file) but we will wait until we know exactly how FRED uses Core NLP before configuring further.
==== Python interface to Stanford Core NLP tools v3.4.1 ====
So we go back to the ''/opt/FRED/externals'' directory and clone the [[https://github.com/dasmith/stanford-corenlp-python.git | Stanford Core NLP Python wrapper]].
We check the python version and install pip and the wrapper dependencies:
<code>
$ python --version
Python 2.7.6
$ sudo apt-get install python-pip
$ sudo pip install pexpect unidecode
</code>
Then we follow [[https://github.com/dasmith/stanford-corenlp-python/blob/master/README.md | the python wrapper documentation]], which specifies that Stanford Core NLP must be a child directory of the python wrapper, so we move our Core NLP directory inside the wrapper's directory:
So we must install [[http://www.nltk.org/install.html | NLTK]] because it looks like a dependency of the wrapper:
<code>
$ sudo pip install -U nltk
$ python
>>> import nltk
$ sudo python -m nltk.downloader -d /usr/local/share/nltk_data all
</code>
We test again:
<code>
$ python client.py
Traceback (most recent call last):
  File "client.py", line 18, in <module>
    tree = Tree.parse(result['sentences'][0]['parsetree'])
</code>
We still have an error, but it doesn't look bad, so we're going to ignore it and move on.
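For the record (an assumption from the truncated traceback, not verified here): ''Tree.parse'' was removed in NLTK 3 in favour of ''Tree.fromstring'', so patching the wrapper's ''client.py'' accordingly should clear the error. A sketch on a stand-in copy:

```shell
# Rename the removed NLTK call; /tmp/client-demo.py stands in for the
# wrapper's client.py (only its line 18 is reproduced here).
printf "tree = Tree.parse(result['sentences'][0]['parsetree'])\n" > /tmp/client-demo.py
sed -i 's/Tree\.parse(/Tree.fromstring(/' /tmp/client-demo.py
cat /tmp/client-demo.py
```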
==== Babelfy ====
I can't find any reference to entity disambiguation with Babelfy in FRED's code, so I won't proceed with the installation from the [[http://babelfy.org/download | Babelfy download page]]. Maybe it's a TODO to replace the Tagme calls (which are still inside FRED's code) with Babelfy calls.