Tips and Tricks.
Hello, I'm looking for a list of telephone numbers, email addresses, LinkedIn profiles, and websites for the top family offices in the world. I'm not interested in the USA, but I am looking at every other region. The priority is the biggest ones first.
I need someone with a strong background in data mining, MATLAB, Python (especially image processing), and technical writing who can help with visualisation, pattern recognition, and classification of a smart meter dataset. In short, we need to preprocess the time series data available in a .mat file to collect features/patterns that are useful for classification. Then we need to convert the data to images and classify them using image processing techniques in Python/MATLAB. Note that we need to document everything, e.g. we need to collect and document the various graphs from the analysis phase and also from classification.
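As an illustrative baseline for the pipeline described above, the preprocessing step can be sketched in plain Python: cut the hourly series into one row per day so each customer's data becomes a small 2-D "image", then classify with a nearest-centroid rule. This is only a sketch; the real work would read the .mat file (e.g. with scipy.io.loadmat, omitted here to stay self-contained) and use proper image-processing features.

```python
# Minimal sketch: reshape a smart-meter time series into a 2-D "image"
# (days x hours) and classify it with a nearest-centroid rule.
# Assumption: readings are hourly; all values below are illustrative.

def series_to_image(values, width=24):
    """Cut a 1-D series into rows of `width` samples (one row per day)."""
    full_days = len(values) // width
    return [values[i * width:(i + 1) * width] for i in range(full_days)]

def centroid(images):
    """Element-wise mean of a list of equally shaped images."""
    flat = [[v for row in img for v in row] for img in images]
    n = len(flat)
    return [sum(col) / n for col in zip(*flat)]

def classify(image, centroids):
    """Return the label whose centroid is closest (squared Euclidean)."""
    flat = [v for row in image for v in row]
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(flat, c))
    return min(centroids, key=lambda label: dist(centroids[label]))
```

For example, `series_to_image(list(range(48)))` yields two rows of 24 hourly readings, i.e. a 2-day "image".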
You'll have to complete a very simple task, about 15 minutes, following an extremely detailed tutorial. You'll have to share some personal information with me, so if you are not comfortable with that, it's better not to proceed. You need to meet at least one of these two conditions: - be a citizen of the European Union (for example Spanish, Portuguese, Romanian, Bulgarian, Croatian, Slovenian, Slovakian, Lithuanian, Estonian, Hungarian, Czech, Polish, etc.) - have a valid passport. I'll pay $75 for the task.
This project involves a lot of data that should be stored in a spreadsheet, either loaded from files or scraped from websites directly, for quick loading. There are many data files the spreadsheet will need to load for updating purposes, and the spreadsheet will be used for updating the data or saving a backup of it. The main requirement is a database-like setup that can deal with perhaps 100+ files loaded with data and combine certain data files together. At the same time you will do some scraping and make the data modifiable. I'm looking for very good loading times, so if you have experience working with large databases I want to hear from you. Loading and processing performance is important to me, so please have solid experience dealing with large volumes of data.
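A rough sketch of the load-and-combine step, assuming the 100+ source files are CSVs with a shared header (file names and columns here are invented examples, not from the brief):

```python
# Load every CSV matching a glob pattern and concatenate their rows,
# keeping a single copy of the shared header. Illustrative only.

import csv
import glob

def combine_csv_files(pattern):
    """Return (header, rows) combined from all CSVs matching `pattern`."""
    combined, header = [], None
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as fh:
            rows = list(csv.reader(fh))
            if not rows:
                continue
            if header is None:
                header = rows[0]
            combined.extend(rows[1:])  # skip each file's own header row
    return header, combined
```

In practice a real database (or at least chunked reads) would be needed at this scale; this only shows the shape of the merge.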
Hi there, I'm looking for someone to collect a database for a project. You will need to be able to read English well and have strong attention to detail. It's 10 hours of work, possibly more if you do the job effectively. The task will include searching for retail shops and finding their email addresses and phone details.
The Gaming Stars app allows gamers to compete in PlayStation, Xbox, and PC games for real money. We are looking for an expert who can help us track in-app user behavior with the above tools and optimize our marketing with data-based decisions.
You are required to prepare technical assessment questions about data science for a pre-assessment platform. The questions should be in the form of multiple choice, free text, and coding, and should cover the different areas a data scientist is expected to know at the hiring stage. Multiple-choice questions should include answer options, and free-text questions should have 5 paraphrased answers. Each question's title should be generated from the specific topic covered in the question. The questions should be about the following topics: databases / MongoDB / SQL / R / Python / algorithms / machine learning techniques / model training. You can find exemplary questions in the attached document.
Hello, I need help with a project on data engineering: using data science and data engineering techniques to enable data-driven decision-making with modern software tooling. The goal is to investigate and demonstrate modern techniques to structure and manage access to data, as well as to deploy and monitor machine learning models that generate data-driven insights. The task involves exercises covering the following stages of a data analytics pipeline: a. planning and design; b. implementation and deployment; c. application and analysis. If interested, please contact me.
Need a Python developer with 5+ years of experience and a great understanding of data structures & algorithms, graphs, and Python libraries, who can: store and represent graph data using appropriate data structures; implement fundamental graph traversal techniques; analyze cohesive subgraph models and their representative computation algorithms; analyze and implement basic machine learning methods; and implement techniques for graph embedding and graph neural networks. Please read the description carefully; your skills will be tested before the project is awarded.
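For reference, the first two skills listed (graph representation and fundamental traversal) can be sketched in a few lines of standard-library Python; the sample edges are invented for illustration:

```python
# Store an undirected graph as an adjacency list and traverse it
# breadth-first. Deterministic neighbour order via sorting.

from collections import defaultdict, deque

def build_graph(edges):
    """Undirected adjacency-list representation from (u, v) pairs."""
    graph = defaultdict(set)
    for u, v in edges:
        graph[u].add(v)
        graph[v].add(u)
    return graph

def bfs_order(graph, start):
    """Breadth-first traversal order starting at `start`."""
    seen, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nbr in sorted(graph[node]):  # sorted for reproducible output
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return order
```

Swapping the deque for a plain list used as a stack would turn this into a depth-first traversal.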
The sinking of the Titanic is one of the most infamous shipwrecks in history. On April 15, 1912, during her maiden voyage, the widely considered "unsinkable" RMS Titanic sank after colliding with an iceberg. Unfortunately, there weren't enough lifeboats for everyone onboard, resulting in the death of 1502 out of 2224 passengers and crew. While there was some element of luck involved in surviving, it seems some groups of people were more likely to survive than others. We want to build a predictive model of which people were more likely to survive the Titanic sinking. The data is labeled according to whether or not a person survived (1 = survived, 0 = did not survive). Download the data from D2L, and the following steps will guide you through building a data mining model.
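As a hedged illustration of the modeling step, here is a pure-Python baseline: estimate the observed survival rate per (sex, passenger class) group and predict the majority outcome for each group. The inline triples are invented toy data; the actual dataset comes from D2L, and a real solution would use a proper classifier.

```python
# Baseline "data mining model": group-wise survival rates with a
# majority-vote prediction rule. Toy data only.

from collections import defaultdict

def fit_group_rates(rows):
    """rows: (sex, pclass, survived) triples -> survival rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [survived, total]
    for sex, pclass, survived in rows:
        counts[(sex, pclass)][0] += survived
        counts[(sex, pclass)][1] += 1
    return {g: s / t for g, (s, t) in counts.items()}

def predict(rates, sex, pclass, default=0):
    """Predict 1 (survived) if the group's observed rate exceeds 0.5."""
    rate = rates.get((sex, pclass))
    return default if rate is None else int(rate > 0.5)
```

A baseline like this gives a floor that any fancier model (decision tree, logistic regression) has to beat.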
I need a basic web scraper, specific to 5 websites, that extracts their data and then rearranges the entries into a single unified format, allowing some basic cross-data analysis. Python is preferred as the language, but I am open to a hybrid solution for the scraping. More info directly.
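The "rearrange the entries into a unified format" step could be sketched as follows: each site-specific scraper returns its own dict shape, and a per-site field map rewrites it into one shared schema for cross-site analysis. The site names and field names below are invented examples.

```python
# Per-site field maps that rename raw scraped records into one shared
# schema. Illustrative names only; real maps come from the 5 target sites.

FIELD_MAPS = {
    "site_a": {"title": "name", "cost": "price"},
    "site_b": {"product": "name", "price_eur": "price"},
}

def normalise(site, record):
    """Rename a raw record's keys into the shared schema, dropping extras."""
    mapping = FIELD_MAPS[site]
    return {mapping[k]: v for k, v in record.items() if k in mapping}
```

Once every record is in the shared schema, cross-site comparisons (e.g. price analysis) become straightforward.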
Looking for a professional who can deliver new, updated data on supercar owners and jet owners for upcoming events in Los Angeles and Miami. We are looking for data from those two areas: names, emails, social media, etc.
Extract, from the business network (see attached file) and based on the professionals' profiles, their emails and a link to their profile into a spreadsheet. Any of the attached keywords MUST appear in the profile description, otherwise we won't accept the record. Give us a quote for 500 emails, 1,000 emails, 5,000 emails, 10K, etc. Each entry in the spreadsheet should have the fields: name, email address, link to profile, and the keyword found in the profile (it has to be one from the attached list). We need it in 3-5 days; you can start with 500 emails/records. Your delivery is an Excel spreadsheet based on what is asked in this project description and in the attachments. We pay $50 for 500 emails - they must be good, relevant emails. We need it ASAP. You can do it manually or automatically and show us the sc...
I'm looking to crawl 250 million unique URLs and scrape the contact information from them using AI / NLP / regex. The scraper should check the seed URL, about page, contact page, support page, and FAQ for the information. Below is the information I'm looking to scrape; all of it is unstructured and in natural language:
- Business / brand name
- Business / brand description
- Is business (true or false)
- Website category (i.e. e-commerce, blog, etc.)
- Website subcategory (i.e. clothing store)
- Emails
- Address (using NLP, *not just the <address> tag)
- Phone numbers
- Twitter
- YouTube
- Facebook
- LinkedIn
- Instagram
I would like the results exported into a CSV/database. You will have to use a combination of regex and NLP to extract the information. Please only apply if you have experien...
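The regex half of an extraction like this might look as follows (the patterns are deliberately simplified assumptions, not production-grade; addresses and descriptions would need the NLP side the brief mentions):

```python
# Simplified regex extraction of emails, phone numbers, and social links
# from unstructured page text. Patterns are illustrative, not exhaustive.

import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")
SOCIAL_RE = re.compile(
    r"https?://(?:www\.)?(?:twitter|facebook|linkedin|instagram|youtube)\.com/[\w./-]+"
)

def extract_contacts(text):
    """Return emails, phone numbers, and social links found in `text`."""
    return {
        "emails": EMAIL_RE.findall(text),
        "phones": [p.strip() for p in PHONE_RE.findall(text)],
        "social": SOCIAL_RE.findall(text),
    }
```

At 250 million URLs, the harder problems are deduplication, politeness, and false-positive filtering rather than the patterns themselves.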
The contact center is a very important touch point for customer service. Customers interact with the contact center for a variety of needs: service fulfillment, information about products and services, complaints, feedback, and other interactions. To enhance contact center service levels, the customer interactions need to be analyzed and mapped to customer satisfaction. Banks want to automate this process using AI-based techniques to measure and monitor various KPIs such as call data quality, customer sentiment, call hygiene, customer satisfaction, etc. Solution expected:
- Convert speech recordings into text for various Indian languages, including call recordings with mixed languages such as Hindi + English, English + Marathi, etc.
- Provide sentiment analysis from the call center recordings...
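As a toy illustration of the sentiment step (applied to transcripts after speech-to-text), here is a lexicon-based scorer. Real multilingual call data such as Hindi + English would need proper language models; the word lists below are invented for demonstration only.

```python
# Toy lexicon-based sentiment: positive-minus-negative word count,
# normalised to [-1, 1]. Word lists are illustrative placeholders.

POSITIVE = {"thanks", "great", "resolved", "happy"}
NEGATIVE = {"complaint", "angry", "waiting", "problem"}

def sentiment_score(transcript):
    """Score a transcript in [-1, 1]; 0.0 when no lexicon words appear."""
    words = transcript.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total
```

Aggregating such scores per agent or per queue is one simple way to feed the KPI dashboards the brief describes.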
I'd like someone who can scrape web data using a Python script. You must be able to write Python scraping and spider scripts, write multi-threaded scripts, optimize the code to reduce run time, and write proxy configuration. It will be multi-millions of records, so you need to be an expert for this. PM me for details.
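A standard-library skeleton of the multi-threaded, proxy-aware setup described could look like this. The proxy address and URLs are placeholders; at millions of pages you would add retry logic, rotating proxies, and rate limiting on top.

```python
# Thread-pool fetching with an optional urllib proxy opener.
# Placeholder proxy and URLs; network calls are kept under __main__.

import concurrent.futures
import urllib.request

PROXY = {"http": "http://127.0.0.1:8080"}  # placeholder proxy config

def make_opener(proxies):
    """Build a urllib opener that routes requests through `proxies`."""
    return urllib.request.build_opener(urllib.request.ProxyHandler(proxies))

def fetch_all(urls, fetch, max_workers=8):
    """Fetch many URLs concurrently; `fetch` is any callable url -> result."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as ex:
        return dict(zip(urls, ex.map(fetch, urls)))

if __name__ == "__main__":
    opener = make_opener(PROXY)
    pages = fetch_all(
        ["http://example.com/a", "http://example.com/b"],
        lambda u: opener.open(u, timeout=10).read(),
    )
```

Because `fetch` is injected, the threading logic can be tested without any network access.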
We need a manual collector or scraper to collect emails based on a keyword in the profile of the person you grab the email from. See full specs attached. Simple, easy job; we pay $50 for 500 contacts/records. You must have Sales Navigator to do this project. Make sure the keyword appears in the candidate's profile. Thanks. We need it ASAP.
The task is to perform data exploration using R
I have a list of 240 million websites I'd like to crawl, and I want to do that in the most cost-effective/quickest way. This would likely be accomplished with a web crawler that you would create, which categorizes the websites it crawls and sorts them into various categories while extracting key information such as business name, emails, phone numbers, and addresses. I would also like to track the technologies the websites use, for example Shopify. Here is how I would like websites to be processed:
1. Determine if the website is in English.
2. Determine the category of the website.
3. Determine if the website is for a business.
4. Determine the category of business.
5. If the website is a business, scrape the information using NLP (get as many pre-determined fields as possible).
6. Schedule to update ...
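Step 1 and the technology tracking from the pipeline above can be sketched with simple heuristics on a page's text. The hint words and technology markers below are illustrative guesses, not a complete detection database; a real crawler would use a language-identification model and a much larger fingerprint set.

```python
# Heuristic language check and technology fingerprinting for crawled pages.
# Both marker sets are small illustrative samples.

ENGLISH_HINTS = {"the", "and", "for", "with", "this", "that"}
TECH_MARKERS = {
    "Shopify": "cdn.shopify.com",
    "WordPress": "wp-content",
}

def looks_english(text, threshold=2):
    """Crude check: enough common English function words present."""
    words = set(text.lower().split())
    return len(words & ENGLISH_HINTS) >= threshold

def detect_technologies(html):
    """Return the names of technologies whose marker appears in the HTML."""
    return [name for name, marker in TECH_MARKERS.items() if marker in html]
```

At 240 million sites, these cheap checks are useful as a first-pass filter before the more expensive NLP extraction in step 5.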
I need a large amount of sports data scraped from a website. You can either code a scraping tool for me or do the actual scraping, but if the latter, it still needs to use some automated method, because this involves many thousands of pages. The scraping process should be unobtrusive so as not to cause any issues for the website. The data needs to be neatly collected into an Excel spreadsheet.
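The "unobtrusive" requirement usually comes down to a fixed delay between requests; a minimal sketch of such a polite loop, writing to a spreadsheet-friendly CSV, might look like this. The fetch/parse step is left abstract, and the delay value is an assumption.

```python
# Polite scraping loop: fixed delay between requests, rows written to CSV
# (which Excel opens directly). fetch_and_parse is injected by the caller.

import csv
import time

def scrape_politely(urls, fetch_and_parse, delay_seconds=2.0):
    """Yield one row of fields per URL, sleeping between requests."""
    for i, url in enumerate(urls):
        if i:
            time.sleep(delay_seconds)  # be unobtrusive between requests
        yield fetch_and_parse(url)

def write_rows(path, header, rows):
    """Write a header plus data rows to a CSV file."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(header)
        writer.writerows(rows)
```

Respecting robots.txt and backing off on errors would be the next additions for a many-thousand-page job.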
We have opportunities for qualified data engineering specialists to work for a leading telecommunications company in Sydney, Australia. We're seeking an experienced data engineering specialist with skills and experience in implementing large-scale big data platforms. Our big data / data analytics technologies: PostgreSQL, Apache Spark, Apache Kafka, Apache HDFS, Apache NiFi, Apache Hive, Apache Flink, Apache Druid, the Scala and Python programming languages, Kubernetes (K8s)/Rancher/Docker, ELK, Internet of Things, data science, AI and ML platforms.
Hello, I am looking for someone to pull the data for the items highlighted in the red box in the image below. I would also like to have the film festival link from the website. I am including the website that has all of the films, and the image. If you can do this job, please type "I am ready now" at the top of your message so I know you have read the job.
Assistance with efficiently cleaning and organizing our File Transfer Protocol (FTP) server.
We need a File Transfer Protocol (FTP) expert who can assist with determining which files on our FTP server are not being used and removing those files in bulk. If you have the knowledge to assist us, please submit your bid.
I want to be able to get the info on all the pinned locations in the map at the location below, along with all of the information that is linked from where it says "Click here for Site information". I also want all that information to be put into a spreadsheet once it has been downloaded. I want to know the exact process for doing this so that I can do it myself whenever I want. Is this possible?
We have a dedicated Linux server and we need mining pool software installed on it. Completely free and open-source pool portals are available for this purpose; if they are installed, we expect them to be configured according to our requirements, such as viabtc and ...