Very basic use case: load CSV data into Elasticsearch in order to perform an aggregate request.
Here you can find compressed CSV dataset files.
Uncompress them before use.
Remember that Logstash needs full read access to the import directory.
Here you can find a Logstash importer template.
Change the paths and filter columns according to your requirements.
Copy it to /etc/logstash/conf.d
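As a rough sketch of what such an importer looks like, a minimal Logstash pipeline reads the CSV files, splits each line into columns, and indexes the result into Elasticsearch. The path, column names, and index name below are placeholders, not values from this repository:

```
# Hypothetical pipeline sketch — adjust path, columns, and index
# to match your dataset before copying to /etc/logstash/conf.d
input {
  file {
    path => "/path/to/import/*.csv"     # placeholder import directory
    start_position => "beginning"
    sincedb_path => "/dev/null"          # re-read files on every restart
  }
}
filter {
  csv {
    separator => ","
    columns => ["col_a", "col_b", "col_c"]  # placeholder column names
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv-import"                # placeholder index name
  }
}
```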
# Ensure the Elasticsearch instance is up
sudo systemctl status elasticsearch.service
# If it is not, start it
sudo systemctl start elasticsearch.service
# Start logstash
sudo systemctl start logstash.service
Here you can find some request templates.
Here you can find some sample responses.
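For reference, an aggregate request against the imported data typically has this shape — a terms aggregation sent to the index's `_search` endpoint. The index and field names here are hypothetical and must match whatever your importer configuration actually created:

```json
POST /csv-import/_search
{
  "size": 0,
  "aggs": {
    "by_col_a": {
      "terms": { "field": "col_a.keyword" }
    }
  }
}
```

`"size": 0` suppresses the individual hits so the response contains only the aggregation buckets.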
composer install
composer start