Ideas2IT is a team of startup nerds, Valley veterans, CTOs, Xooglers, and big-company alumni. We deliver web, mobile app, and data science solutions to Fortune 100 companies and startups.
The candidate should be able to develop a Windows app independently. The backend REST service will be provided by the team.
The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include developing Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.

Key Responsibilities:
- Create a GRAND Data Lake and Warehouse that pools data from GRAND's different regions and stores in the GCC
- Ensure source data quality measurement, enrichment, and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure

Skills Needed:
- Very strong SQL; demonstrated experience with RDBMSs (e.g., PostgreSQL) and NoSQL databases (e.g., MongoDB); Unix shell scripting preferred
- Experience with UNIX and comfort working in the shell (bash preferred) and scheduling jobs with cron
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Working with data delivery teams to set up new Hadoop users, including creating Linux users and setting up and testing HDFS, Hive, Pig, and MapReduce access for them
- Cluster maintenance, including adding and removing nodes, using tools such as Ganglia, Nagios, and Cloudera Manager Enterprise
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screening Hadoop cluster job performance and capacity planning
- Monitoring Hadoop cluster connectivity and security
- File system management and monitoring
- HDFS support and maintenance
- Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades as required
- Defining, developing, documenting, and maintaining Hive-based ETL mappings and scripts
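As an illustrative sketch of the "source data quality measurement" responsibility above, the following Python snippet computes a row count and per-column null rates for a source table. The table, column names, and sample data are hypothetical, and SQLite stands in for the actual warehouse engine (PostgreSQL, Hive, etc.):

```python
import sqlite3

# Hypothetical source table standing in for a regional store feed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE store_sales (store_id INTEGER, region TEXT, amount REAL);
    INSERT INTO store_sales VALUES
        (1, 'GCC', 120.0),
        (2, NULL,  75.5),
        (3, 'GCC', NULL);
""")

def quality_report(conn, table, columns):
    """Measure basic data quality: total rows and null rate per column."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    report = {"row_count": total}
    for col in columns:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()[0]
        report[f"{col}_null_rate"] = nulls / total if total else 0.0
    return report

print(quality_report(conn, "store_sales", ["region", "amount"]))
```

In a real pipeline, metrics like these would be written to a quality-reporting table and tracked per source and per load, rather than printed.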
We are looking for a creative candidate who can join our team to build new products. Join us to learn more.
If you want to work at a company that is changing life as you know it, this is the place to be. We are building Artificial Intelligence (AI)-based agents that let machines, businesses, and customers communicate with each other instantly. We are currently looking for a DevOps Engineer to work from our Bangalore location. Detailed requirements below.

Requirement: The candidate should have 2-5 years of experience in:
1. Deploying and managing multiple servers
2. Hands-on management of database technologies such as MongoDB, Redis, and Elasticsearch
3. Containerization, ideally using Docker and Kubernetes
4. Real-time streaming technologies such as Kafka and Kinesis
5. Big data technologies such as Hadoop, HDFS, and Spark (preferred)
Asia’s pioneering Algorithmic Trading Research and Training Institute, QuantInsti provides practically oriented education on algorithmic trading through online real-time video sharing as well as interactive, hands-on courses. Our flagship course, the “Executive Programme in Algorithmic Trading” (EPAT™), has been successfully completed by participants from more than 50 countries across the globe, and it continues to grow! In December 2016, we launched our own EduTech platform, Quantra (quantra.quantinsti.com). Quantra is a self-paced learning portal focused on imparting quantitative and algorithmic trading concepts, positioned to be the one-stop solution ‘for everything algo’! Within 6 months of its launch, Quantra has seen exponential growth in its user base, spanning 100+ countries. With an expert in-house content team and a marketplace model for practitioners, we aim to launch 50+ new courses in the next 7-8 months. We are expanding our team and are looking for an enthusiastic full-stack developer who brings his/her existing expertise into our systems.