I don't need customization for my WordPress site; everything is fine, but for some unknown reason my website takes a long time to load the first byte. I need to reduce the response time. I am using Amazon Lightsail services. Check the report: [login to view URL] NO AUTO REPLY
Expanded evidence-based treatments; promotion of research on related health services and suicide prevention.
Skills: • 4-8 years of experience on Big Data platforms like Hadoop, MapReduce, Spark, HBase, CouchDB, Hive, Pig, etc. • Experienced with data modeling, design patterns, and building highly scalable and secure analytical solutions • Used SQL, PL/SQL and similar languages, plus UNIX shell scripting • Worked with Teradata, Oracle, MySQL, Informatica, Tableau
...HDFS, and also can't consume data from HDFS/S3 using Spark, due to an error about the missing Hadoop client. The HDFS cluster and Spark are set up and show healthy states. The client's [login to view URL] and [login to view URL] have also been downloaded and placed under the local Hadoop installation, but: 1) 'hdfs dfs -ls /' leads to an error; 2) Spark complains about a problem to ...
I can provide support for Hadoop/Big Data administration and help you with ongoing projects. - Well experienced in Big Data (Hadoop | Kafka | Spark | NoSQL | Elasticsearch | Cloud) administration and platforms to accommodate expanding business needs. - Well experienced with different Big Data and cloud vendors (Hortonworks, Cloudera, MapR, etc.)
...results through the feature vectors stored in HDFS (Hadoop Distributed File System). v. During brain tumor classification, we will apply a classifier to the feature vectors, which serve as its input. vi. The specified classification has to be applied with parallel computation (the MapReduce framework) on the classifier, to get the expected
I work for a German engineering company, and I have to prepare the market entry for a special recycling technology in Hungary that allows decentralized recycling of waste engine oil and unrecyclable plastic waste into diesel fuel. For that I need to evaluate companies with access to larger amounts of waste mineral oil or plastic; 20 tons per month is the lower limit. The main intenti...
This is a small part of a big project. Need to have complete knowledge of Hadoop: how NameNode and DataNode read/write functionality works, creating a heterogeneous cluster in the cloud (OpenStack is desirable), and pushing the changes to Hadoop. Desired language: Java. Help is provided when asked. We have only 2 weeks for this. Write operation
Predict the loan_status (0 or 1) for the approved-loans data, based on the data given. The project should be done using Hadoop MapReduce and logistic regression. If you cannot do it with logistic regression, choose another supervised machine learning technique. Train and test data are given: use the train data to train and the test data to test. Use Python or
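A minimal sketch of the logistic-regression part in plain NumPy (the MapReduce wiring is omitted; the feature matrix and 0/1 label vector are assumed to be already parsed from the given train/test files):

```python
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit logistic regression by batch gradient descent on (X, y with 0/1 labels)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid probabilities
        grad_w = X.T @ (p - y) / len(y)
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict_loan_status(X, w, b):
    """Return 0/1 predictions by thresholding the sigmoid at 0.5."""
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) >= 0.5).astype(int)
```

Train on the train split, then call `predict_loan_status` on the test split; the same update could be distributed by having mappers compute partial gradients and a reducer sum them.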
...help in building a cybersecurity analytics platform using the Apache stack, such as NiFi (log collector), Kafka (communication), Storm (real-time stream computation system), Strata Hadoop (S3), Metron (analytics framework), Elasticsearch, and for monitoring (Zabbix, Grafana for visualisation, ElastAlert for platform alerting). I'm open to hearing other ideas
Job Description: we are looking for a data governance developer. Should have good knowledge of schema evolution and data lineage; Hadoop, Java, Spark; experience on Spark with Avro/ORC and Protocol Buffers. Should know how to design schema evolution and implement Apache Sentry security. Should be strong in metadata management, data lineage, and data provenance.
Hello, I need to run 20-25 Chrome (Opera) browsers and start my extension, but my PC only allows about 10 because of CPU usage. When I open 20, everything gets slower and the extension doesn't work properly. I need a way to reduce the browsers' CPU usage. Thank you.
...format. I would like to find out the maximum speed in a file and the mean speed for each borough. The project is not very complex, but it must be done using Hadoop (big data), and the MapReduce code should be provided. I attached code for a weather dataset. Mine should be similar, but instead of Station ID mine would use borough, and temperature for
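A sketch of how such a job could look with Hadoop Streaming in Python, assuming input lines of the form `borough,speed` (the real dataset layout may differ):

```python
from itertools import groupby

def mapper(lines):
    """Map step: emit tab-separated (borough, speed) pairs from CSV lines."""
    for line in lines:
        borough, speed = line.strip().split(",")
        yield f"{borough}\t{speed}"

def reducer(sorted_pairs):
    """Reduce step: for each borough (Hadoop delivers pairs sorted by key),
    emit the maximum and the mean speed."""
    for borough, group in groupby(sorted_pairs, key=lambda kv: kv.split("\t")[0]):
        speeds = [float(kv.split("\t")[1]) for kv in group]
        yield f"{borough}\tmax={max(speeds):.1f}\tmean={sum(speeds) / len(speeds):.1f}"
```

Under Hadoop Streaming, each function would read `sys.stdin` and print its output; the shuffle phase supplies the key-sorted input the reducer relies on.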
We are currently trying to make our page load faster. The site se...up the database, enabling caching, updating PHP, optimising images and updating WordPress, but nothing seems to be working. Currently looking for someone who could help us reduce the number of requests. I have attached screenshots of all the HTTP requests to give a better idea.
...results (the final centers and the assignments for each center). After you've finished the first question, visualize your result using a 2-d plot. Testing is to be done using Hadoop. NOTE: the deadline for this project is Tuesday 11/25/18, 10:00 am EST...
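For the 2-d visualization step, a small matplotlib sketch (the function and file names are illustrative; `points`, `assignments`, and `centers` are assumed to come from the finished k-means run):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")          # headless backend, so no display is needed
import matplotlib.pyplot as plt

def plot_clusters(points, assignments, centers, out_path="kmeans.png"):
    """Scatter the 2-d points coloured by cluster and overlay the final centers."""
    points = np.asarray(points)
    centers = np.asarray(centers)
    plt.figure()
    plt.scatter(points[:, 0], points[:, 1], c=assignments, cmap="viridis", s=15)
    plt.scatter(centers[:, 0], centers[:, 1], c="red", marker="x", s=120,
                label="final centers")
    plt.legend()
    plt.savefig(out_path)
    plt.close()
    return out_path
```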
Hello Trainees, I am a Big Data Hadoop trainer. Currently some batches are running at Concept Solutions Pvt. Ltd. in Indore. I am providing job-oriented Big Data Hadoop training with one trending Big Data Hadoop project. So anyone who is interested in becoming a DATA SCIENTIST, just let me know. Thanks, Moin Khan, Big Data Hadoop Trainer
I have k-means MapReduce code which runs for 2 clusters, but I want to change it so it works for different k values. I need help getting it to work.
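The usual change is to stop hard-coding the two centers and parameterize everything by k. A plain-NumPy sketch of the two MapReduce-style steps (assignment as the map phase, center recomputation as the reduce phase):

```python
import numpy as np

def assign_step(points, centers):
    """Map phase: label each point with the index of its nearest center."""
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return dists.argmin(axis=1)

def update_step(points, labels, centers):
    """Reduce phase: recompute each center as the mean of its assigned points;
    an empty cluster keeps its previous center."""
    new_centers = centers.copy()
    for j in range(len(centers)):
        members = points[labels == j]
        if len(members):
            new_centers[j] = members.mean(axis=0)
    return new_centers

def kmeans(points, k, iters=20):
    """k-means for any k; naive init with the first k points (k-means++ is better)."""
    centers = points[:k].astype(float).copy()
    for _ in range(iters):
        labels = assign_step(points, centers)
        centers = update_step(points, labels, centers)
    return centers, labels
```

In the real MapReduce job, `assign_step` runs in the mappers and `update_step` in the reducers, with the driver looping until the centers stop moving.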
I am looking for someone who is an expert in Hadoop MapReduce, machine learning, and Python/Java to help me with loan default prediction using MapReduce, any machine learning model, and the Java/Python programming language on a Hadoop cluster. Loan default prediction means predicting whether the customer is going to pay the loan back or not, based on the data
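Once a model is trained, the "prediction with MapReduce" part is embarrassingly parallel: each mapper just scores its share of records. A sketch assuming logistic-regression weights and input lines of the form `customer_id,f1,f2,...` (both hypothetical):

```python
import math

def score_mapper(lines, weights, bias):
    """Map step: apply an already-trained logistic model to each record.
    Emits 'customer_id<TAB>prediction' with 1 = default, 0 = repay (assumed coding)."""
    for line in lines:
        fields = line.strip().split(",")
        cid, feats = fields[0], [float(x) for x in fields[1:]]
        z = bias + sum(w * x for w, x in zip(weights, feats))
        p = 1.0 / (1.0 + math.exp(-z))        # probability of default
        yield f"{cid}\t{1 if p >= 0.5 else 0}"
```

No reducer is strictly needed for scoring; an identity reducer (or reducer count 0) just writes the mapper output to HDFS.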
Website: [login to view URL] The current page loading time for our website is 12-17 seconds, it needs to be reduced to less than 5 seconds. Our website is currently running on Magento CE 2.1.9. Please do not just put in a quote with the comment "I can do it." Please make sure that you FULLY UNDERSTAND the scope of work required to undertake this project as we will pay the amoun...
We need a single dedicated part-time resource on Hadoop, Python (expert), PySpark, AWS, and NiFi (optional) to support a US client on weekday mornings for around 90 minutes, 6:00-8:00 am IST. We will pay 22,000 per month. Only candidates with a minimum of 4+ years of experience are eligible to bid.
My work is related to medical images like MR...detect and extract brain tumors. Due to the large size of those images, storage and processing become cumbersome. So my proposed work is to store those images in Hadoop HDFS and apply an SVM algorithm to classify whether a tumor is benign or malignant. So I want a developer who will code this for me
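In practice one would use a library SVM (e.g. scikit-learn or Spark MLlib) on the feature vectors extracted from the images; as a dependency-light illustration, a minimal linear SVM trained with hinge-loss sub-gradient descent, with labels -1 = benign and +1 = malignant (an assumed encoding):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimal linear SVM: minimize lam/2 * ||w||^2 + mean hinge loss.
    y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                       # points inside or past the margin
        grad_w = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(y)
        grad_b = -y[viol].sum() / len(y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def classify(X, w, b):
    """Return +1 (malignant) or -1 (benign) per feature vector."""
    return np.where(X @ w + b >= 0, 1, -1)
```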
...other students) and eliminates unauthorized access with the help of an authentication feature. • It eliminates errors in time and attendance entries. • The system will use MapReduce/Hadoop services to generate reports for Admins, Instructors, and Students. • It addresses the absence situation: it sends reports when a student is absent for more than a week
Need to reduce my WordPress database size from the current 2 GB to 500 MB without affecting the website.
Hello Hadoop Expert: I have a project that involves the following technologies. See below. Hadoop, Pig, Hive, Sqoop, Oozie, Spark. I need someone to help me with my project and through that, I would like to get a better understanding of the above technologies. You do not have to be an expert in ALL of them. If you know some of them and can help me
[login to view URL] for a data engineer resource with minimum 2+ Yrs of experience in Hadoop. [login to view URL] 4 Years of experience. [login to view URL] is an immediate requirement [login to view URL] required for 1 month [login to view URL] can work on this remotely
1. You need to add Alt and ti...add Alt and title text to 300x images in WordPress, always with the same structure: "PAGE TITLE Hautpflege Marke Logo" 2. You need to remove all EXIF data from the pictures. 3. You need to reduce the file size of 400x pictures: make sure the images are compressed so they are served at the smallest possible file size. Timeline: 1 day
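Steps 2 and 3 can be done in one pass with Pillow; a sketch (the paths and quality value are placeholders, and the alt/title text in step 1 is edited in the WordPress media library or via WP-CLI, not here):

```python
from PIL import Image

def strip_exif_and_compress(src, dst, quality=70):
    """Re-save an image without metadata at a reduced JPEG quality.
    Rebuilding a fresh Image from the raw pixels drops EXIF and other info."""
    with Image.open(src) as im:
        clean = Image.new(im.mode, im.size)
        clean.putdata(list(im.getdata()))
        clean = clean.convert("RGB")             # JPEG output requires RGB
        clean.save(dst, "JPEG", quality=quality, optimize=True)
```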