Mirror of https://github.com/CodeforLeipzig/stadtratmonitor.git
Synced 2025-04-04 09:46:43 +02:00

Update README.md

parent: 3d0f764320
commit: 80d02fd7c2

1 changed file with 11 additions and 15 deletions
README.md (26 changed lines)

@@ -53,20 +53,18 @@ docker.
 
 ### Local machine setup
 1. Install Ruby, Bundler, Elasticsearch
-1. Start Elasticsearch: `elasticsearch`
-1. Setup Rails app: `bundle && bundle exec rake db:setup`
-1. See "Importing data" below
-1. Start Rails server: `bundle exec rails s`
-1. Visit [http://localhost:3000](http://localhost:3000)
+2. Start Elasticsearch: `elasticsearch`
+3. Setup Rails app: `bundle && bundle exec rake db:setup`
+4. See "Importing data" below
+5. Start Rails server: `bundle exec rails s`
+6. Open [http://localhost:3000](http://localhost:3000)
 
 ### Using docker
 
 1. Install docker and docker-compose: https://docs.docker.com/compose/install/
-1. Start the app: `docker-compose up`
-1. Initialize the database: `docker-compose run --user srm web rake db:setup'`
-1. See "Importing data" below
-1. Get the address of the docker host: `docker-machine ip default`
-1. Point your browser to: 'http://\<IP of docker host\>:3000'
+2. Start the app: `docker-compose up`
+3. Initialize the database: `docker exec stadtratmonitor-web sh -c "rake db:setup && rake import_papers && rake index:rebuild"`
+4. Open [http://localhost:3000](http://localhost:3000)
 
 ```
 rake assets:clean
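The "Local machine setup" steps above assume Ruby, Bundler, and Elasticsearch are already installed. A minimal sketch (hypothetical, not part of the repository) that checks those tools are on the PATH before you start:

```shell
#!/bin/sh
# Hypothetical prerequisite check for the "Local machine setup" steps;
# the tool names are the ones those steps invoke.
missing=""
for tool in ruby bundle elasticsearch; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -n "$missing" ]; then
  echo "missing:$missing"
else
  echo "all prerequisites found"
fi
```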
@@ -80,15 +78,13 @@ rake assets:precompile
 ```
 
 
-### Importing data and building the index
+### Importing data via allris-scraper (optional)
 1. You can use [our allris-scraper](https://github.com/CodeforLeipzig/allris-scraper) to download the papers (resp. their links) from the [OPARL](https://oparl.org/) API, this will produce an input.json file
-1. Put this input.json to a public web server and set the URL to this file then in:
+2. Put this input.json to a public web server and set the URL to this file then in:
 `app/controllers/import_controller.rb` and `lib/tasks/import_papers.rake`
-1. Import the data from our scraper: `docker-compose run --user srm web rake import_papers'`
-1. Build the elasticsearch index: `docker-compose run --user srm web rake index:rebuild'`
 
 ### Running tests
 
 Assuming docker and docker-compose is installed:
 
-1. `docker-compose run --user srm web bin/run-tests`
+1. `docker-compose run web bin/run-tests`
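The one-command initialization in step 3 of "Using docker" chains three rake tasks inside the `stadtratmonitor-web` container. As a sketch only, the same chain split into a named helper function (the container name and task names come from that command; the function name `init_app` is hypothetical):

```shell
#!/bin/sh
# Sketch: the combined initialization from step 3 of "Using docker",
# one `docker exec` per rake task. Call it once after `docker-compose up`
# has started the containers.
init_app() {
  # create and seed the database, import the scraped papers, then
  # rebuild the Elasticsearch index -- stopping at the first failure,
  # exactly as the original chained command does
  docker exec stadtratmonitor-web rake db:setup &&
    docker exec stadtratmonitor-web rake import_papers &&
    docker exec stadtratmonitor-web rake index:rebuild
}
# init_app
```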
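The import described above depends on the scraped `input.json` being well-formed. A hypothetical helper, assuming `python3` is available on the host (plain sh has no JSON parser), to validate the file before publishing it to a web server:

```shell
#!/bin/sh
# Hypothetical helper: verify that a scraped file parses as JSON before
# it is uploaded for the importer. Assumes python3 is on PATH.
check_json() {
  if python3 -m json.tool "$1" >/dev/null 2>&1; then
    echo "$1: valid JSON"
  else
    echo "$1: not valid JSON"
  fi
}
# check_json input.json
```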