EdGE Networks offers next-generation HR technology solutions for talent acquisition and workforce optimization, powered by Data Science and Artificial Intelligence.
Experience: Minimum of 3 years of relevant development experience
Qualification: BS in Computer Science or equivalent
Skills Required:
• Good server-side development experience in Java and/or Python
• Exposure to data platforms (Cassandra, Spark, Kafka) will be a plus
• Interest in Machine Learning will be a plus
• Good-to-great problem-solving and communication skills
• Ability to deliver in an extremely fast-paced development environment
• Ability to handle ambiguity
• Should be a good team player
Job Responsibilities:
• Learn the technology area where you are going to work
• Develop bug-free, unit-tested and well-documented code as per requirements
• Stringently adhere to delivery timelines
• Provide mentoring support to Software Engineers and/or Associate Software Engineers
• Any other responsibilities as specified by the reporting authority
The candidate will be responsible for all aspects of data acquisition, data transformation, and analytics scheduling and operationalization to drive high-visibility, cross-division outcomes. Expected deliverables include developing Big Data ELT jobs using a mix of technologies, stitching together complex and seemingly unrelated data sets for mass consumption, and automating and scaling analytics into GRAND's Data Lake.
Key Responsibilities:
- Create a GRAND Data Lake and Warehouse that pools the data from GRAND's different regions and stores in GCC
- Ensure source data quality measurement, enrichment and reporting of data quality
- Manage all ETL and data model update routines
- Integrate new data sources into the DWH
- Manage the DWH cloud (AWS/Azure/Google) and infrastructure
Skills Needed:
- Very strong SQL skills; demonstrated experience with RDBMS and NoSQL databases (e.g., Postgres, MongoDB); Unix shell scripting preferred
- Experience with UNIX and comfort working with the shell (bash or Korn shell preferred)
- Good understanding of data warehousing concepts and big data systems: Hadoop, NoSQL, HBase, HDFS, MapReduce
- Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop, and to expand existing environments
- Working with data delivery teams to set up new Hadoop users, including setting up Linux users and setting up and testing HDFS, Hive, Pig and MapReduce access for the new users
- Cluster maintenance, as well as creation and removal of nodes, using tools like Ganglia, Nagios, Cloudera Manager Enterprise and others
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screening Hadoop cluster job performance and capacity planning
- Monitoring Hadoop cluster connectivity and security
- File system management and monitoring; HDFS support and maintenance
- Collaborating with application teams to install operating system and Hadoop updates, patches and version upgrades when required
- Defining, developing, documenting and maintaining Hive-based ETL mappings and scripts (a minimal sketch of such a job follows below)
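For orientation, a Hive-based ETL job of the kind this posting describes might look like the following minimal PySpark sketch. This is an illustrative assumption, not part of the role description: the database, table and column names (staging.raw_sales, dwh.sales_clean, order_id, store_id, region) are hypothetical placeholders.

```python
# Minimal sketch of a Hive-based ETL job, assuming a configured Hive
# metastore; all table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("grand-dwh-etl")
    .enableHiveSupport()   # read/write Hive tables via the metastore
    .getOrCreate()
)

# Extract: read the raw source table registered in Hive.
raw = spark.table("staging.raw_sales")

# Transform: basic data-quality enrichment (dedupe, flag missing keys).
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("dq_missing_store", F.col("store_id").isNull())
)

# Load: overwrite the warehouse table, partitioned by region.
(clean.write
      .mode("overwrite")
      .partitionBy("region")
      .saveAsTable("dwh.sales_clean"))
```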
We’re looking for an experienced Data Engineer with strong cloud technology experience to join our team and help our big data team take our products to the next level. This is a hands-on role: you will be required to code and develop the product in addition to your leadership role. You need a strong software development background and a love of working with cutting-edge big data platforms. You are expected to bring extensive hands-on experience with Amazon Web Services (Kinesis streams, EMR, Redshift), Spark and other big data processing frameworks and technologies, as well as advanced knowledge of RDBMS and data warehousing solutions.
REQUIREMENTS
- Strong background working on large-scale data warehousing and data processing solutions
- Strong Python and Spark programming experience
- Strong experience in building big data pipelines (see the sketch after this list)
- Very strong SQL skills are an absolute must
- Good knowledge of OO, functional and procedural programming paradigms
- Strong understanding of various design patterns
- Strong understanding of data structures and algorithms
- Strong experience with Linux operating systems
- 2+ years of experience working as a software developer or in a data-driven environment
- Experience working in an agile environment
- Lots of passion, motivation and drive to succeed!
Highly desirable
- Understanding of agile principles, specifically Scrum
- Exposure to Google Cloud Platform services such as BigQuery, Compute Engine, etc.
- Docker, Puppet, Ansible, etc.
- Understanding of the digital marketing and digital advertising space would be advantageous
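As a rough illustration of the Python/Spark pipeline work this role involves, a minimal batch job on EMR might look like the sketch below. The S3 paths, event schema and metric (daily active users per campaign) are hypothetical assumptions, not details from the posting.

```python
# Minimal sketch of a Spark batch pipeline on EMR; S3 paths and the
# event schema are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

# Ingest raw click events (e.g., landed in S3 by a Kinesis stream).
events = spark.read.json("s3://example-bucket/raw/events/")

# Aggregate daily active users per campaign.
daily = (
    events.withColumn("day", F.to_date("event_ts"))
          .groupBy("day", "campaign_id")
          .agg(F.countDistinct("user_id").alias("dau"))
)

# Write columnar output for downstream loading into Redshift
# (e.g., via the COPY command).
daily.write.mode("overwrite").parquet("s3://example-bucket/curated/dau/")
```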
www.aaknet.co.in/careers/careers-at-aaknet.html
You are extraordinary, a rock star who has hardly found a place to leverage or challenge your potential, and you have not spotted a skyrocketing opportunity yet? Come play with us and face the challenges we can throw at you; chances are you might be humbled (positively). Do not take it that seriously, though! Please be informed that we rate CHARACTER and attitude as high as, if not higher than, your great skills, experience and sharpness. :)
Best wishes & regards,
Team Aak!
Full Stack Developer for our Big Data Practice. The role will include everything from architecture to ETL to model building to visualization.
Bachelor’s or Master’s degree in computer science or software engineering.
- Experience with object-oriented design, coding and testing patterns, as well as experience engineering (commercial or open source) software platforms and large-scale data infrastructures
- Ability to architect highly scalable distributed systems using different open source tools
- Experience building high-performance algorithms
- Extensive knowledge of different programming or scripting languages such as Python and Scala; Apache Spark
- Experience with different NoSQL or RDBMS databases such as MongoDB, Google BigQuery, Cassandra, Elasticsearch and HBase; data pipelines; Impala
- Experience building data processing systems with Hadoop and Hive using Python
- Good exposure to AWS Lambda, Kinesis, EMR, Redshift and Kafka (a minimal Lambda-on-Kinesis sketch follows)
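As one concrete form of the AWS Lambda and Kinesis exposure listed above, a minimal Python Lambda handler consuming a Kinesis stream might look like the sketch below. The record payload format and the downstream targets named in the comment are hypothetical assumptions.

```python
# Minimal sketch of an AWS Lambda handler for a Kinesis event source;
# the JSON payload layout is a hypothetical placeholder.
import base64
import json

def handler(event, context):
    """Decode each Kinesis record and process the parsed payload."""
    for record in event["Records"]:
        # Kinesis data arrives base64-encoded inside the Lambda event.
        payload = base64.b64decode(record["kinesis"]["data"])
        doc = json.loads(payload)
        # In a real pipeline the record would be enriched and forwarded
        # here (e.g., to Redshift, Elasticsearch or HBase).
        print(doc)
    return {"processed": len(event["Records"])}
```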
Responsibilities
- Developing intelligent and scalable engineering solutions from scratch
- Working on high/low-level product designs & roadmaps along with a team of ace developers
- Building products with bleeding-edge technologies such as Ruby on Rails
- Building innovative products for customers in Cloud, DevOps, Analytics, AI/ML and lots more