Hi,

We are interested in hiring a talented DevOps engineer for our Hyderabad location in a permanent role. We are looking for the following skill combination:

- Linux/Unix administration
- Container technologies such as Docker and Kubernetes
- Cloud automation tools and technologies such as Chef, Ansible, and Terraform
- AWS, including setting up VPCs, subnets, routing, Direct Connect, VPC peering, security groups, and IAM roles and policies
- Automation/configuration management using SaaS-based CI/CD services (Travis CI, etc.)
- A wide variety of open-source technologies and cloud services (experience with AWS is required)
- SQL and NoSQL
- Python programming experience
- Best practices and IT operations in an always-up, always-available service
- Agile/Scrum work environment and development process

If the above combination interests you, please share the details below:

Total Experience:
DevOps Experience:
Kubernetes Experience:
Docker Experience:
Cloud Network Security Experience:
AWS Experience:
Jenkins/Travis CI Experience:
Microservices Experience:
NoSQL Experience:
Current CTC:
Expected CTC:
Notice Period:
Reason for Change:
Current Location:
Preferred Location: Hyderabad - Y/N (if yes, mention a valid reason why Hyderabad works for you)

Thanks & Regards,
Rohit
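As a small illustration of the VPC and subnet planning skills listed above, here is a minimal sketch (the CIDR ranges are hypothetical, not from the posting) that splits a VPC CIDR block into equally sized subnets using Python's standard ipaddress module:

```python
import ipaddress

def plan_subnets(vpc_cidr, new_prefix):
    """Split a VPC CIDR block into equal subnets of the given prefix length."""
    vpc = ipaddress.ip_network(vpc_cidr)
    return [str(subnet) for subnet in vpc.subnets(new_prefix=new_prefix)]

# Hypothetical VPC: carve a /16 into four /18 subnets (e.g. one per availability zone).
subnets = plan_subnets("10.0.0.0/16", 18)
print(subnets)  # → ['10.0.0.0/18', '10.0.64.0/18', '10.0.128.0/18', '10.0.192.0/18']
```

The same arithmetic is what tools like Terraform's cidrsubnet function automate when laying out subnets across availability zones.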
We are looking for a top-notch Development Engineer with Java and Ruby on Rails (or just Ruby on Rails) experience, plus Big Data and strong RDBMS skills, to develop an analytics infrastructure that will generate insights into customer experiences on our products. You will get your hands dirty on the latest and greatest platforms and frameworks.

Position: Sr./Principal SDE - Software Development Engineer (8 to 12 yrs)

Key Skills: Java, Ruby on Rails, RDBMS (Oracle or any other PL/SQL platform), Big Data

Key Qualifications:
- Excellent coding skills in Java and Ruby on Rails
- Passion for programming technologies; a polyglot is what we'd look for!
- Basic algorithms and data structures, building logic, logical programming
- Strong object-oriented programming and design skills, preferably in Java or Ruby on Rails
- Hands-on RDBMS experience; experience with Oracle preferred
- Any Big Data experience will be a plus
- Excellent analytical and problem-solving skills, plus oral and written communication skills
- Experience building scalable, reliable, distributed Unix-based systems with Big Data processing technologies

Job Summary: In this position, you will be working with the best minds in the industry on unique and challenging projects: high-volume, low-latency, highly available, transactional, distributed-computing system design problems, and the automation of these systems using state-of-the-art technologies for mobile and web platforms. Passion for software development and quality is a must!
Greetings from Unify Technologies!

We are looking for great talent with hands-on Scala (or Python or Java) programming experience with Spark and Big Data for one of our top projects, which we are working on with one of the top product development companies in technology. We are sure that this experience will help scale up your career as well! Please find below the JD for your quick reference.

What we are looking for: Candidates who have a keen interest in security, privacy, scalability, and performance, cater to customer experience, and pay attention to detail. You'll be part of the team that develops software, builds automated tests, and does release and reliability engineering for extraordinary frontend and backend systems scaling to billions of users and devices.

Key Qualifications:
- Scala programming or excellent Java programming, including web services
- Hadoop Big Data development at 500 to 800 TB scale
- Individual contributors with strong coding skills
- Member of HackerRank, HackerEarth, Stack Overflow, et al.
- Proficiency with Big Data processing technologies (Hadoop, Spark, Oozie)
- Experience building data pipelines and analysis tools using Scala, Java, or Python
- Experience building large-scale server-side systems with distributed processing algorithms
- Aptitude to independently learn new technologies
- Strong problem-solving skills
- Excellent oral and written English communication skills

Location of work: Hyderabad

Our Company: Unify Technologies
Our Website: http://unifytech.com/
LinkedIn: https://www.linkedin.com/company/9206998
Offices in: Gurgaon, Pune, Hyderabad - India, and Seattle - USA
Industry/Domain: Cloud/Product - Cloud Automation, Data Engineering, Mobile

A few words about Unify Technologies: Unify is a pioneer in developing technology solutions that impart greater value and create collaboration among global businesses. Unify leads the way in changing conventional wisdom to assure greater returns on the investments made.
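As a hedged illustration of the data-pipeline skill mentioned above, here is a minimal generator-based extract → transform → load sketch in Python (the record layout and the filtering rule are hypothetical, chosen only to show the pattern):

```python
from typing import Iterable, Iterator

def extract(rows: Iterable[str]) -> Iterator[dict]:
    """Extract: parse raw 'user,event' CSV lines into records."""
    for row in rows:
        user, event = row.split(",")
        yield {"user": user.strip(), "event": event.strip()}

def transform(records: Iterator[dict]) -> Iterator[dict]:
    """Transform: keep only 'click' events and normalize user names."""
    for rec in records:
        if rec["event"] == "click":
            yield {**rec, "user": rec["user"].lower()}

def load(records: Iterator[dict]) -> list:
    """Load: materialize the stream (stand-in for a real sink such as HDFS)."""
    return list(records)

raw = ["Alice, click", "Bob, view", "Carol, click"]
result = load(transform(extract(raw)))
print(result)  # → [{'user': 'alice', 'event': 'click'}, {'user': 'carol', 'event': 'click'}]
```

Because each stage is a generator, records stream through one at a time; a Spark job applies the same staged shape, but partitioned across a cluster.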
Unify helps customers focus on their business while taking care of their software needs, with a global strategy to transform their company.

Employment Type: Full-Time
Joining time: Immediate to 30 days
Work Location: Hyderabad - India
Education: Bachelor's degree or equivalent in Computer Science or related fields from reputed colleges

Job Summary: A job at UNIFY is inspired and innovative. Do you enjoy working on unique and challenging problems? Our project - Apple's Enterprise Technology Services (ETS) - needs engineers to be part of the internet-scale systems and platforms that power all of Apple's enterprise applications and customer-facing products, including iCloud, iTunes, Retail, and the Online Store. Kindly check the detailed JD and company details below, and let us know your interest in pursuing this position.

Thanks,
Sudheendra Srinivasan
Lead Recruiter | 91 333 73693
unifytech.com
- Bachelor's or Master's degree in Computer Science, Engineering, MIS, Math, or a related field, or equivalent experience
- Minimum of 7 years of enterprise IT application experience, including at least 2 years of hands-on software development and systems architecture with healthcare (EMR/EHR) experience
- 5+ years of design/implementation/consulting experience with large commercial/enterprise applications
- Experience in product definition, solution architecture, technical leadership, conducting workshops and stakeholder meetings, determining functional requirements, and driving prioritization
- Attention to detail and excellent problem-solving skills
- Excellent analytical skills
- Intermediate-level knowledge of cloud services
- Knowledge of the healthcare domain is a plus
- Demonstrated experience leading or developing high-quality, enterprise-scale software products using waterfall and agile methodologies
- Experience in agile development a plus
- Excellent communication skills - both verbal and written - as well as listening skills
- Proven track record of managing as-is and to-be analysis
- Broad technology architectural perspective and experience
- Understanding of emerging technology trends
- Technology and business transformation experience
- Professional executive demeanor; decisive, with highly versatile interpersonal skills
- Brings a personality that minimizes conflict and drives positive discussions; collaborates effectively and is inclusive of disparate opinions
- Emerging technology experience (AI, ML, NLP, cloud, containers, etc.)
- Highly organized, with multitasking skills; efficient in ambiguous situations
- Anticipates roadblocks, escalates situations appropriately, and suggests various approaches to overcome them
Position: Sr. SDE - Senior Software Development Engineer (4 to 8 Yrs)

- Passion for data engineering and programming technologies
- Basic algorithms and data structures, building logic, logical programming
- Strong object-oriented programming and design skills, preferably in Java, Scala, or Python with Spark
- Excellent analytical and problem-solving skills, plus oral and written communication skills
- Experience with Spark or other distributed computing environments
- Experience building scalable, reliable, distributed Unix-based systems with Big Data processing technologies (Hadoop, HBase, Cassandra, other NoSQL solutions)
- Experience designing and implementing data ingestion and transformation for big data platforms (Spark, Sqoop, Oozie, Kafka, HDFS, Cassandra, Pig, Impala, etc.)
- Proven track record designing highly parallelized data ingestion and transformation jobs in Spark, including Spark Streaming
- Production experience working with Apache Spark clusters
- Familiarity with concepts such as time complexity and distributed computing
- Hands-on Big Data development experience in huge environments, on the order of 500 to 800 TB
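As a small, hedged illustration of the map/reduce pattern behind Spark-style parallelized jobs (a single-process sketch using only the standard library, not real Spark), the word count below mimics the map → shuffle → reduce stages a cluster would run across partitions:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Map: emit a (word, 1) pair for each word in a line."""
    return [(word.lower(), 1) for word in line.split()]

def shuffle_phase(pairs):
    """Shuffle: group emitted values by key, as a cluster does across partitions."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the grouped counts per word."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["spark makes big data simple", "big data big insights"]
counts = reduce_phase(shuffle_phase(chain.from_iterable(map_phase(l) for l in lines)))
print(counts["big"])  # → 3
```

In Spark the same job collapses to roughly `rdd.flatMap(map_phase).reduceByKey(add)`; the shuffle stage is what dominates cost at the 500-800 TB scale mentioned above, which is why minimizing shuffled data is a core skill for this role.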