A music playlist application programmed in Python, linked to a Cassandra, MongoDB, or Redis database, whichever is the better option.
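A minimal sketch of the playlist storage, assuming the Redis option is chosen. The key scheme (`playlist:<name>`) and the in-memory stub are illustrative only; with a live server the same `rpush`/`lrange` calls would go to a `redis.Redis()` client.

```python
# Sketch of a music playlist store on Redis (assumed option). With a live
# server you would use redis.Redis(); the stub below mimics the two list
# commands used (RPUSH / LRANGE) so the sketch runs without a server.

class FakeRedis:
    """In-memory stand-in for redis.Redis, list commands only."""
    def __init__(self):
        self._data = {}

    def rpush(self, key, *values):
        self._data.setdefault(key, []).extend(values)
        return len(self._data[key])

    def lrange(self, key, start, stop):
        items = self._data.get(key, [])
        stop = len(items) if stop == -1 else stop + 1
        return items[start:stop]

def add_track(client, playlist, track):
    # One Redis list per playlist, keyed "playlist:<name>" (assumed scheme)
    client.rpush(f"playlist:{playlist}", track)

def get_tracks(client, playlist):
    return client.lrange(f"playlist:{playlist}", 0, -1)

client = FakeRedis()  # replace with redis.Redis(host="localhost") in production
add_track(client, "rock", "Track A")
add_track(client, "rock", "Track B")
print(get_tracks(client, "rock"))
```

Redis lists preserve insertion order, which maps naturally onto playlist track order; a document store like MongoDB would instead hold one document per playlist with an array of tracks.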
Folks, I'm looking for an expert on Java, Grails, and Elasticsearch to complete the project. This is not a brand-new project: the previous developer had to leave it with some urgency, so I'm looking for the right candidate to take over and complete it. It is for an online classifieds site. With the online classifieds site as baseline information, the attached document will let you grab the idea. 1. I need the exact price (not a placeholder bid); if you need any further information, you may ask for it in your proposal. 2. I need the exact time to complete the project (code and bug fixing).
I am looking for website development similar to [url removed, login to view]. The project must be delivered using the stack below: Language - Node.js, Framework - Express.js, Database - MongoDB, Replica sets - for high data availability.
User end:
- Post a complaint: company name (auto-populated if it exists, otherwise created new), subject, description, category and sub-category (added from Admin), country/state/city auto-populated (added from Admin), zip code, city, website, photos/videos, captcha
- Browse complaint list: detailed view, comments, ratings
- Search complaints: keyword matching on subject, company name, and description
- Ask a question: subject, details, category (as listed above), captcha
- Signup / login: Facebook login, Gmail login
- User account: edit profile, change password, my complaints, my comments, my questions
- Company details: name, website, phone, address, complaints (total, pending, resolved)
- Crawl bot: data scraping from [url removed, login to view] - only company name with category, phone number, and address
- SEO support
Admin:
- Admin login and sub-admin login
- Manage categories
- Manage complaints (delete complaints individually or as a whole set; also need an option to delete the complaints while keeping the profile page active)
- Manage questions
- Manage users (ban option needed)
- Manage country, state, city
- Manage companies
- Manage comments
- Manage ads section - Google Ads script or other code needed
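A hypothetical shape for a complaint document, plus the stated search rule (keyword matching on subject, company name, and description). All field values are invented examples; the sketch is in Python for brevity, but the same document shape carries over to the MongoDB/Express build.

```python
# Hypothetical complaint document shape and the keyword search rule from
# the spec ("keyword matching subject, company name and description").
# Values are made up; the shape maps directly onto a MongoDB document.

complaint = {
    "company_name": "Acme Corp",          # auto-populated or created new
    "subject": "Late delivery",
    "description": "Order arrived two weeks late.",
    "category": "Shipping",               # added from Admin
    "sub_category": "Delays",
    "country": "US", "state": "CA", "city": "San Jose",
    "zip_code": "95110",
    "status": "Pending",                  # feeds Total/Pending/Resolved rollups
}

def matches(doc, keyword):
    # Case-insensitive match against the three searchable fields
    k = keyword.lower()
    return any(k in doc[f].lower()
               for f in ("subject", "company_name", "description"))

print(matches(complaint, "acme"))    # True
print(matches(complaint, "refund"))  # False
```

In MongoDB itself this would typically become a text index over the three fields rather than a per-document scan.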
Need to create a POC on using a graph database to replace the existing Elasticsearch. We will have to create a prototype data model from the existing search data upload file, using Neo4j as the tool, to highlight all the relationships and nodes and to emphasize why a graph database is better than the existing Elasticsearch for querying. The primary tool we will be using to create the graph database model is Neo4j.
As we are new to this, we ask for general advisory instruction for an initial implementation. Subsequently, the freelancer will have access to virtual machines on which the stack must be installed and configured. Filebeat (or whatever it takes) will be installed on a key server in production. The final objective is to build visualizations in Kibana on the main variables of these log records.
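For the Filebeat piece, a minimal config sketch; the log path and Elasticsearch endpoint are placeholders to adapt to the production server:

```yaml
# Minimal Filebeat sketch: ship one set of log files to Elasticsearch,
# then build the Kibana visualizations on the resulting index.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log        # placeholder path on the key server

output.elasticsearch:
  hosts: ["http://localhost:9200"]  # placeholder ES endpoint
```

Recent Filebeat versions prefer the `filestream` input type over `log`, so the advisory step should pin the stack version before settling the config.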
We have a multi-node ES cluster. Here are the problems to be fixed: 1. When one node goes down, it takes almost 30 minutes to recover from that state. 2. When a heavy aggregation query is run, it puts the health of the entire cluster at stake; we need a way to kill that query.
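For problem 2, Elasticsearch's task management API can cancel a runaway search. A sketch that only builds the relevant REST calls (the endpoints are the documented `_tasks` API; send them with any HTTP client against your cluster, and the example task ID is a placeholder):

```python
# Sketch for killing a heavy aggregation query via the Elasticsearch
# task management API. These helpers only construct the REST calls;
# issue them with curl, requests, or the elasticsearch-py client.

def list_search_tasks_request():
    # GET /_tasks?actions=*search&detailed=true lists running search tasks
    return ("GET", "/_tasks", {"actions": "*search", "detailed": "true"})

def cancel_task_request(task_id):
    # POST /_tasks/<task_id>/_cancel asks the owning node to cancel it
    return ("POST", f"/_tasks/{task_id}/_cancel", {})

method, path, params = list_search_tasks_request()
print(method, path, params)
print(cancel_task_request("oTUltX4IQMOUUVeiohTt8A:12345"))  # placeholder ID
```

Cancellation is cooperative (the query stops at its next check point), so pairing this with a sensible `timeout` on heavy aggregations is usually part of the fix as well.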
Hello, I want to build a very basic dashboard in Redash using a MongoDB database.
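Redash queries against its MongoDB data source are written as JSON rather than SQL; a hypothetical example (collection name, filters, and fields are all placeholders to adapt):

```json
{
  "collection": "events",
  "query": { "type": "signup" },
  "fields": { "_id": 0, "created_at": 1, "user": 1 },
  "sort": [{ "name": "created_at", "direction": -1 }],
  "limit": 100
}
```

Each saved query like this becomes a widget that can be pinned to the dashboard.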
We are seeking a developer to partner with other engineering teams to help architect and build the data pipeline that ingests hundreds of billions of data points for our Field Analytics Platform utilizing AWS. Expand capability by integrating various open-source data-processing technologies such as Hadoop, Kafka, Spark, Cassandra, and Neo4j into our infrastructure. Become an expert in the AWS services that we leverage. Help to efficiently integrate our big-data infrastructure in the AWS cloud. Build services, deploy models and algorithms, perform model training, and provide tools to make our infrastructure more accessible to all our data scientists. Enable specific initiatives to build our capabilities, from environmental classification to In-Season Field Analytics and more. Requires a degree or 15+ years of experience in the field or a related area.
Import CDR information from a .csv file, performing a concurrent-call calculation prior to inserting it into Elasticsearch using Logstash. Then plot a line graph over time, using Timelion in Elasticsearch/Kibana, showing concurrent calls for the time span covered by the given .CSV file.
Calculate the number of concurrent calls for each second of duration in the attached .CSV file: for each call that starts at a given point in time, increment a concurrent-calls counter, set a timer running until the end of the call, then decrement the counter. The field in each CDR indicating the call start time is "setup_start_ts", and the "call_time" field gives the duration of the call in milliseconds (measured from the final successful setup message until the first BYE message). This yields a count of concurrent calls to insert into the Elasticsearch database. Please use PYTHON to implement this calculation and any other code you need to write.
Plot that number on a Timelion graph in Kibana. Provide a display option to show the minimum and maximum 1-second values when the plot is averaged over 1 minute or more, using the 1-second values you have placed in the ES database. Provide the ability, from the Timelion graph GUI, to set the time axis to display the following spans:
• 5 minutes
• 20 minutes
• 1 hour
• 6 hours
• 24 hours
• 3 days
• 7 days
To confirm, these .CSVs use:
• Semicolons (;) as field separators
• Field values quoted when necessary (i.e., if there is a semicolon in a field value, it will be quoted, e.g. ";tag=3948u522;phone=whatever")
For example, when using the Python csv module to read these, the reader would be invoked as follows:
import csv
cdr_reader = csv.reader(input_file, delimiter=';', quotechar='"')
Please send us the code. This is just an extremely small proof-of-concept project.
There will be many more, larger projects to follow upon quick, bug-free, and successful completion of this one. Purely for informational purposes, here is a link to a similar project within Elasticsearch using Avaya CDRs; we do NOT need you to support Avaya CDRs: [url removed, login to view]
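The per-second concurrent-call count described above can be sketched as an event sweep: +1 at `setup_start_ts`, -1 at `setup_start_ts` plus `call_time`. The sketch assumes a semicolon-delimited CSV with those two fields and, for simplicity, epoch-second timestamps; the resulting per-second points are what would then be indexed into Elasticsearch for Timelion.

```python
import csv
import io
from collections import Counter

# Sketch of the concurrent-call calculation: increment at each call's
# start, decrement when call_time (milliseconds) elapses, then emit a
# count for every second. Field names per the spec: "setup_start_ts"
# (epoch seconds assumed here) and "call_time" (milliseconds).

def concurrent_calls(csv_text):
    deltas = Counter()
    reader = csv.DictReader(io.StringIO(csv_text), delimiter=";", quotechar='"')
    for row in reader:
        start = int(row["setup_start_ts"])
        end = start + int(row["call_time"]) // 1000  # ms -> whole seconds
        deltas[start] += 1
        deltas[end] -= 1
    counts, running = {}, 0
    for second in range(min(deltas), max(deltas) + 1):
        running += deltas[second]
        counts[second] = running  # value to index into Elasticsearch
    return counts

# Two overlapping calls: one 3 s long starting at t=100, one 1 s at t=101
sample = 'setup_start_ts;call_time\n100;3000\n101;1000\n'
print(concurrent_calls(sample))  # {100: 1, 101: 2, 102: 1, 103: 0}
```

The sweep avoids ticking a timer per call: sorting events by second and accumulating the deltas gives every 1-second value in a single pass over the file.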
Hello, we are looking to utilize Scrapy (Scrapy+Redis or another option) running on our internal servers to handle data mining for the purposes of price monitoring, alerting, and adjustments. We need a freelancer who is already skilled with Scrapy to build the environment from the ground up (it's OK to use templates and such). It needs to have the following functions at a minimum: 1) Scrapy running in a scalable server environment 2) A documented method for business users to add websites/fields to be scraped 3) Data storage 4) Data reporting 5) Anti-blacklisting (IP rotation, proxies, checking [url removed, login to view], throttling, spoofing, headless browser) 6) Quality assurance / compliance. The project must be well documented in an understandable way in case you leave the project at some point. We also need you to stay on board long term to help maintain this environment. We will provide all the infrastructure and associated items. Payment: the first 50% milestone funded upfront, the second 50% funded when a working model is shown; 100% released once you finish and it is fully functional with all items in the scope of this project complete. Any items not specifically mentioned in the original project are not valid. If you bid on this project, your bid is the final price for the initial project.
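The IP-rotation part of item 5 can be as simple as a round-robin proxy pool. A minimal sketch of that core logic; the proxy URLs are placeholders, and in Scrapy this would typically live inside a downloader middleware that sets `request.meta["proxy"]` before each request:

```python
from itertools import cycle

# Minimal round-robin proxy rotator for the anti-blacklisting requirement.
# In Scrapy, a downloader middleware would call next_proxy() and assign
# the result to request.meta["proxy"]. Proxy URLs are placeholders.

class ProxyPool:
    def __init__(self, proxies):
        self._cycle = cycle(proxies)  # endless round-robin iterator

    def next_proxy(self):
        return next(self._cycle)

pool = ProxyPool(["http://proxy1:8080", "http://proxy2:8080"])
print([pool.next_proxy() for _ in range(3)])
# ['http://proxy1:8080', 'http://proxy2:8080', 'http://proxy1:8080']
```

A production pool would also drop proxies that start failing and combine rotation with Scrapy's built-in AutoThrottle for the throttling requirement.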
We have an existing PHP / Vue.js system with a SQL database. Basically, it is a searchable database of small-business profiles, where large companies can search for small suppliers and send out tenders to them, with additional modules for other functions such as downloadable business resources. The system is 60% complete and we need to finish it off and finalise some features. We estimate 15 to 30 days of work.
A leader in building modern systems for risk management and online sports betting is looking for experienced specialists to work as Senior Python Developers. We offer very varied and challenging work with the latest technologies, strongly oriented toward building new solutions for large clients on the EU and Polish markets. Job description: - Designing and programming backend solutions for the transaction and risk-management platform - Optimizing the processing of data coming from external providers - Working with the latest technologies in a technologically advanced team. Candidate requirements: - Knowledge of Python (Flask, Django, Celery) - Knowledge of non-relational databases (Redis, MongoDB) - Knowledge of relational databases - Knowledge of Elasticsearch will be an additional asset - Experience in building REST APIs. We offer: - B2B cooperation - Attractive salary of 10,000-15,000 PLN net - A rich benefits package - Work in an office in Łódź, Warsaw, or Białystok - A close-knit, young, and ambitious team.