

About EDUBRICZ TECHNOLOGIES LLP

As a Campus Coordinator, you are expected to perform the following tasks:
1. Manage collaboration with candidates, campus hiring managers, and the T&P team efficiently and effectively, including communication and calendar management.
2. Manage data, schedule meetings, interviews, and HR events, and maintain agendas.
3. Manage and coordinate campus recruitment activities.
4. Address candidate grievances and queries.
5. Handle pre-joining engagement, onboarding, and documentation of campus joiners.
6. Issue offer/appointment letters once candidates are finalized.
Note: Only male candidates located in Delhi NCR and Gurgaon are eligible.
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, and transformation mapping (source to target), as well as automation and testing strategies. The role translates business needs into technical solutions while adhering to established data guidelines and approaches, from a business-unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities :
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements :
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation.
- Excellent understanding of in-memory distributed computing frameworks such as Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL/analytical databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.
- Deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus.
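To make the resource-allocation bullet concrete, here is a minimal sketch of a common Spark executor-sizing heuristic (the node specs, per-node reservations, and overhead fraction are illustrative assumptions, not figures from this posting):

```python
# Sketch of a common Spark executor-sizing heuristic (an illustration,
# not an official formula): reserve 1 core and 1 GB per node for the OS,
# use ~5 cores per executor for good throughput, leave one executor
# slot for the driver, and hold back ~10% of memory for off-heap overhead.

def size_executors(nodes, cores_per_node, mem_per_node_gb,
                   cores_per_executor=5, overhead_frac=0.10):
    usable_cores = cores_per_node - 1            # 1 core/node for the OS
    usable_mem = mem_per_node_gb - 1             # 1 GB/node for the OS
    execs_per_node = usable_cores // cores_per_executor
    total_executors = nodes * execs_per_node - 1  # 1 slot for the driver
    mem_per_exec = usable_mem / execs_per_node
    heap_gb = int(mem_per_exec * (1 - overhead_frac))
    return {
        "spark.executor.instances": total_executors,
        "spark.executor.cores": cores_per_executor,
        "spark.executor.memory": f"{heap_gb}g",
    }

conf = size_executors(nodes=10, cores_per_node=16, mem_per_node_gb=64)
print(conf)
# -> {'spark.executor.instances': 29, 'spark.executor.cores': 5,
#     'spark.executor.memory': '18g'}
```

The exact reservations vary by cluster manager and workload; the point is that "resource allocation" experience means being able to reason through this arithmetic rather than guessing values.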

Minimum 2 years of MERN development experience.
Strong skills in working with Node.js and MongoDB frameworks/libraries, and the ability to lead product development from inception to completion.
Essential Skills -
Web/ Mobile development experience:
3-5 years of experience building scalable web/mobile applications.
Experienced in: Node.js and its libraries; MongoDB, Redis, Firebase; JavaScript frontend frameworks/libraries such as AngularJS and React.
Building new RESTful services and APIs.
Experienced in working with AWS services such as S3 and EC2, and DevOps tools such as GitHub and Docker.
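As a self-contained illustration of the RESTful pattern named above (the role's stack is Node.js; Python's standard library is used here only to keep the sketch dependency-free, and the route and data are hypothetical):

```python
# Minimal REST sketch: one resource, one GET route, JSON responses.
# The "users" store and route are hypothetical placeholders.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

USERS = {"1": {"id": "1", "name": "Ada"}}   # hypothetical in-memory store

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /users/<id> -> 200 + JSON body, anything else -> 404
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "users" and parts[1] in USERS:
            body = json.dumps(USERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):   # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)   # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

with urllib.request.urlopen(f"http://127.0.0.1:{port}/users/1") as resp:
    data = json.loads(resp.read())
print(data)   # {'id': '1', 'name': 'Ada'}

server.shutdown()
```

In the posting's actual stack the same shape would be an Express route handler backed by a MongoDB collection; the resource-per-URL, verb-per-action structure is what makes it RESTful, not the language.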
Essential Qualities :
Self-starter who wants to learn new technologies, tinker, experiment, and implement new products; seeks an entrepreneurial, cross-functional experience; motivated to mentor incoming talent and lead projects.
Walk-in interviews for Data Entry Executives!
Selected Candidates will get immediate joining.
Note: only interested candidates should walk in for interviews.
Job Location - Visakhapatnam
Open positions: 80. (Please pass this message on to any friends or relatives who are looking for a job, and share it in your WhatsApp groups.)
Interview Timings: 11 AM to 2 PM
Interview Date: Interviews Happen every day (Monday – Friday)
Please mention my name on the resume while coming for an interview.
Industry Name: US Title Insurance
Company Name: IDA Automation Pvt. Ltd.
Job Type: Data Process Executive
Skill set required: net typing speed of 30-35 wpm with 95% accuracy (mandatory).
Language: basic English comprehension; the candidate should be able to speak and understand English.
Qualification: undergraduates or any graduates (Inter/Diploma/ITI).
B.Tech 2013/2014/2015/2016/2017/2018 is eligible.
Note: postgraduates are not eligible.
Job timings: two shifts: day (any 9 hours within the 7 AM to 6 PM window) and night (9 PM to 6 AM).
Food / Transport and Accommodations are not provided by Company.
Gender: both male and female candidates are eligible (night shift is mandatory for both).
Age: 30 or Below
CTC (per annum): 1,43,000 (take-home: 9,382).
Night shift allowance: 1200 (Approx)
Six days working
Indian holidays are not observed.
Job Type: Full Time (Work for Home is not available)
* Bring photocopies of all documents (resume, 10th and Inter certificates, Aadhaar, PAN card) to the interview.
Note: we do not charge candidates anything.
Please walk in directly to the venue below with all documents (no calls/messages).
Contact person: Sarat Kumar
Landmark: opposite Wipro
Interview Venue:
IDA Automation Pvt. Ltd.
Gate No: 2, First Floor,
Tech Mahindra Hub/ Tech Smart Building,
Satyam Junction,
Visakhapatnam - 530013





Aviso is the AI Compass that guides Sales and Go-to-Market teams to close more deals, accelerate revenue growth, and find their True North.
We are a global company with offices in Redwood City, San Francisco, Hyderabad, and Bangalore. Our customers are innovative leaders in their market. We are proud to count Dell, Honeywell, MongoDB, Glassdoor, Splunk, FireEye, and RingCentral as our customers, helping them drive revenue, achieve goals faster, and win in bold new frontiers.
Aviso is backed by Storm Ventures, Shasta Ventures, Scale Venture Partners, and leading Silicon Valley technology investors.
What you will be doing:
● Aviso is in the process of building a highly scalable, and highly performant upgrade to its industry-leading AI product. This requires a complete rethink of the base architecture of how data is stored and accessed using persistent Databases like MongoDB, PostgreSQL and Redshift.
● As part of this far-reaching engineering goal, this role will be primarily responsible for the design and development of this re-architecture, working with all other parts of the Engineering organization.
Details below:
○ You will be responsible for designing the most optimal data schema architecture
to handle billions of rows of information, accessible in real-time, both for the
purposes of enabling our Machine Learning team, as well as other engineering
teams presenting complex analytical functionality directly to customers.
○ You will be responsible for designing the most optimal physical database
architecture to scale reliably with business growth, while optimizing cost.
● You will be working with our platform team to create the Service Oriented Architecture needed to create a highly redundant, fail-safe and responsive end customer facing service.
○ Solid working experience with and understanding of the AWS environment, including VPC, EC2, EBS, S3, RDS, SQS, CloudFormation, NAT Gateways, and Lambda, will be needed to achieve this.
● You will also own the definition and implementation of enterprise-grade security, using your skills around LDAP integration, security policies, and auditing in a Linux/AWS environment.
● Additionally, you will be responsible for designing the Continuous Integration and Continuous Delivery (CI/CD) platforms to enable all of engineering to deliver code faster with better quality.
○ Here, you will work daily with the QA and engineering teams to enable unit tests and automation tests that increase code test coverage.
○ Using your experience and strong understanding of DevOps automation (orchestration/configuration management and CI/CD tools such as Puppet, Chef, and Jenkins), you will identify the right set of CI/CD tools needed to make this a success.
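The schema work described above revolves around partitioning large tables so that queries touch only the data they need. A toy sketch of time-based partitioning, using sqlite3 as a stand-in for PostgreSQL/Redshift (table and column names are hypothetical):

```python
# Toy illustration of time-based partitioning: rows are routed to
# per-month tables so a date-bounded query scans only the relevant
# partition. sqlite3 stands in for PostgreSQL/Redshift here; the
# "events" schema is a hypothetical placeholder.
import sqlite3

conn = sqlite3.connect(":memory:")

def partition_for(day):                          # day = "YYYY-MM-DD"
    name = f"events_{day[:7].replace('-', '_')}" # e.g. events_2024_01
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {name} (day TEXT, account TEXT, amount REAL)")
    return name

def insert(day, account, amount):
    conn.execute(f"INSERT INTO {partition_for(day)} VALUES (?, ?, ?)",
                 (day, account, amount))

insert("2024-01-15", "acme", 120.0)
insert("2024-01-20", "acme", 80.0)
insert("2024-02-02", "globex", 50.0)

# A January query touches only the January partition:
total = conn.execute(
    "SELECT SUM(amount) FROM events_2024_01 WHERE account = 'acme'").fetchone()[0]
print(total)   # 200.0
```

At billions of rows the same idea appears as PostgreSQL declarative partitioning or a Redshift sort/dist key, with the database rather than application code doing the routing; the sketch only shows why bounding the scanned data is what makes real-time access over large tables feasible.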
What you bring:
● Minimum 10-15 years' experience in database architecture and management
● A Degree in Computer Science from a top University or equivalent
● Strong and relevant experience building and maintaining a high performance, high volume SaaS solution.
● Industry leading experience in managing petabyte scale databases.
● Solid working experience and good understanding of the AWS environment, including VPC, EC2, EBS, S3, RDS, SQS, CloudFormation, NAT Gateways, Lambda, and Redshift.
● Strong understanding of DevOps automation (orchestration/configuration management and CI/CD tools such as Puppet, Chef, and Jenkins) is required.
● Experience implementing role based security, including LDAP integration, security policies, and auditing in a Linux/Hadoop/AWS environment. We expect hands on experience with monitoring tools such as AWS CloudWatch, Nagios or Datadog.
● Networking : Working knowledge of TCP/IP networking, SMTP, HTTP, load-balancers (ELB, HAProxy) and high availability architecture.
● Strong experience in continuous integration, build automation, configuration management, code repositories, performance engineering, application monitoring, system monitoring, and management and deployment automation.
● Strong knowledge of Unix systems engineering with experience in Ubuntu or Red Hat Linux.
● Programming: experience with Python and Unix shell scripting
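Of the networking items listed, load-balancer behavior is the simplest to sketch: round-robin, a common default in HAProxy and ELB, rotates requests across backends in a fixed order. A minimal illustration (backend names are hypothetical):

```python
# Minimal round-robin load-balancing sketch: each request is assigned
# to the next backend in a fixed rotation, so traffic spreads evenly.
# Backend names are hypothetical placeholders.
from itertools import cycle

backends = ["app-1", "app-2", "app-3"]
rotation = cycle(backends)

def route(request_id):
    return (request_id, next(rotation))

assignments = [route(i) for i in range(6)]
print(assignments)
# Each backend receives exactly 2 of the 6 requests.
```

Production balancers layer health checks, connection draining, and weighting on top of this, but the rotation above is the core scheduling idea behind "high availability architecture" in the requirement.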
Aviso offers:
● Dynamic, diverse, inclusive startup environment driven by transparency and velocity
● Bright, open, sunny working environment and collaborative office space
● Convenient office locations in Redwood City, Hyderabad and Bangalore tech hubs
● Competitive salaries and company equity, and a focus on developing world class talent operations
● Comprehensive health insurance available (medical) for you and your family
● Unlimited leaves with manager approval and a 3 month paid sabbatical after 3 years of service
● CEO moonshot projects with cash awards every quarter
● Upskilling and learning support including via paid conferences, online courses, and certifications
● Every month, Rs. 2,500 will be credited to a Sodexo meal card

