RESPONSIBILITIES:
- Apply statistical modeling, data transformation, data mining and machine learning to develop predictive models, attributes, alerts and indicators that improve Identity Access Management and Point of Contact processes for insurance carriers.
- Execute database queries to extract and aggregate LN's data assets into attributes used for predictive modeling, reporting and analysis.
- Examine and prepare client data for analysis.
- Identify data quality concerns and opportunities (e.g., performance indicators, scores, approval/declination indicators) and clearly communicate them to stakeholders.
- Explore and mine new data sources to help optimize and validate existing models.
- Review product results and communicate findings to internal and external stakeholders.
- Ideate, research and design new analytics and data science methodologies for large-scale, complex data assets.
- Store, archive, index, transfer and analyze large and complex datasets (structured and unstructured).
- Create and enhance Python/R or GUI-based reporting tools that allow team members to gain insights into LexisNexis data assets.
- Enforce data quality testing best practices.
- Perform other duties as needed.
REQUIREMENTS:
- Bachelor’s or foreign equivalent degree in Computer Science, Data Science, Information Systems, or a related field required.
- 3 years of experience in the job offered or a related occupation required.
- Also required: 2 years of experience with advanced ECL programming (HPCC Systems); record linking for Insurance Header files; creating product dashboards using Power BI; and batch LexID assignment for customer records.
- Employee reports to the LexisNexis Risk Solutions, Inc. office in Alpharetta, GA, but may telecommute from any location within the U.S.
- Experience can be concurrent.