Senior .NET Developer

We are seeking a highly skilled .NET Backend Developer with core expertise in C#, SQL Server, MongoDB, MySQL, and large-scale data processing. This role focuses on efficient data ingestion, structured data integration, and high-speed processing of large datasets while ensuring optimal memory and resource utilization.

The ideal candidate has deep experience in handling structured and unstructured data, multi-threaded processing, database optimization, and real-time data synchronization to support a scalable, performance-driven backend architecture.

Key Focus Areas

  • Efficient Data Ingestion & Processing: Developing scalable pipelines to process large structured/unstructured data files.
  • Data Integration & Alignment: Merging datasets from multiple sources with consistency.
  • Database Expertise & Performance Optimization: Designing high-speed relational database structures for efficient storage and retrieval.
  • High-Performance API Development: Developing low-latency RESTful APIs to handle large data exchanges efficiently.
  • Multi-Threaded Processing & Parallel Execution: Implementing concurrent data processing techniques to optimize system performance.
  • Caching Strategies & Load Optimization: Utilizing in-memory caching & indexing to reduce I/O overhead.
  • Real-Time Data Processing & Streaming: Using message queues and data streaming for optimized data distribution.

Required Skills & Technologies

  • Backend Development: C#, .NET Core, ASP.NET Core Web API
  • Data Processing & Integration: Efficient Data Handling, Multi-Source Data Processing
  • Database Expertise: SQL Server, MongoDB, MySQL (Schema Optimization, Indexing, Query Optimization, Partitioning, Bulk Processing)
  • Performance Optimization: Multi-threading, Parallel Processing, High-Throughput Computing
  • Caching & Memory Management: Redis, Memcached, IndexedDB, Database Query Caching
  • Real-Time Data Processing: Kafka, RabbitMQ, WebSockets, SignalR
  • File Processing & ETL Pipelines: Efficient Data Extraction, Transformation, and Storage Pipelines
  • Logging & Monitoring: Serilog, Application Insights, ELK Stack
  • CI/CD & Cloud Deployments: Azure DevOps, Kubernetes, Docker

Key Responsibilities

1. Data Ingestion & Processing

  • Develop scalable data pipelines to handle high-throughput structured and unstructured data ingestion.
  • Implement multi-threaded data processing mechanisms to maximize throughput (see the pipeline sketch after this list).
  • Optimize memory management techniques to handle large-scale data operations.
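As an illustration of the kind of work involved, here is a minimal producer/consumer ingestion sketch built on System.Threading.Channels; the input file path and the ParseAndStoreAsync step are placeholders, not part of an existing codebase.

```csharp
// Illustrative only: a bounded producer/consumer ingestion pipeline.
// "input.csv" and ParseAndStoreAsync are placeholders.
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Channels;
using System.Threading.Tasks;

class IngestionPipelineSketch
{
    static async Task Main()
    {
        // A bounded channel applies back-pressure so memory use stays flat under load.
        var channel = Channel.CreateBounded<string>(new BoundedChannelOptions(10_000)
        {
            SingleWriter = true,
            FullMode = BoundedChannelFullMode.Wait
        });

        var tasks = new List<Task>();

        // Producer: stream the file line by line instead of loading it whole.
        tasks.Add(Task.Run(async () =>
        {
            using var reader = new StreamReader("input.csv");
            string? line;
            while ((line = await reader.ReadLineAsync()) != null)
                await channel.Writer.WriteAsync(line);
            channel.Writer.Complete();
        }));

        // Consumers: one worker per core parses and persists records concurrently.
        for (var i = 0; i < Environment.ProcessorCount; i++)
        {
            tasks.Add(Task.Run(async () =>
            {
                await foreach (var line in channel.Reader.ReadAllAsync())
                    await ParseAndStoreAsync(line); // placeholder for parsing + persistence
            }));
        }

        await Task.WhenAll(tasks);
    }

    static Task ParseAndStoreAsync(string line) => Task.CompletedTask;
}
```

The bounded channel makes the producer stall rather than exhaust memory when consumers fall behind.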

2. Data Integration & Alignment

  • Implement high-speed algorithms to merge and integrate datasets efficiently (see the merge sketch after this list).
  • Ensure data consistency and accuracy across multiple sources.
  • Optimize data buffering & streaming techniques to prevent processing bottlenecks.
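A minimal sketch of one possible merge strategy, assuming records from both sources share a common key and conflicts are resolved last-write-wins; the CustomerRecord shape and source names are hypothetical.

```csharp
// Illustrative only: merging records from two hypothetical sources by a shared key,
// keeping the most recently updated version to stay consistent across sources.
using System;
using System.Collections.Generic;
using System.Linq;

record CustomerRecord(string Id, string Name, DateTime UpdatedUtc); // hypothetical shape

static class DatasetMergeSketch
{
    // Later UpdatedUtc wins; both inputs may contain the same Id.
    public static IReadOnlyList<CustomerRecord> Merge(
        IEnumerable<CustomerRecord> sourceA,
        IEnumerable<CustomerRecord> sourceB)
    {
        return sourceA.Concat(sourceB)
                      .GroupBy(r => r.Id)
                      .Select(g => g.OrderByDescending(r => r.UpdatedUtc).First())
                      .ToList();
    }
}
```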

3. High-Performance API Development

  • Design and develop high-speed APIs for efficient data retrieval and updates.
  • Implement batch processing & streaming capabilities to manage large data payloads (see the streaming sketch after this list).
  • Optimize API response times and query execution plans.
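A minimal sketch of a streaming endpoint using ASP.NET Core minimal APIs: returning IAsyncEnumerable<T> lets the framework serialize rows as they are produced instead of buffering the full payload. The route, the Order record, and the placeholder loop are assumptions; it presumes a web SDK project with implicit usings.

```csharp
// Illustrative only: a minimal ASP.NET Core endpoint that streams a large result set
// as it is produced, rather than buffering the whole payload in memory.
// The route and Order type are hypothetical.
using System.Runtime.CompilerServices;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// ASP.NET Core serializes IAsyncEnumerable<T> incrementally,
// so the response starts flowing before all rows have been read.
app.MapGet("/api/orders/export", (CancellationToken ct) => StreamOrders(ct));

app.Run();

static async IAsyncEnumerable<Order> StreamOrders(
    [EnumeratorCancellation] CancellationToken ct)
{
    // Placeholder loop; a real implementation would read from a database cursor.
    for (var i = 0; i < 100_000 && !ct.IsCancellationRequested; i++)
    {
        yield return new Order(i, DateTime.UtcNow);
        await Task.Yield();
    }
}

record Order(int Id, DateTime CreatedUtc);
```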

4. Database Expertise & Optimization (SQL Server, MongoDB, MySQL)

  • Design efficient database schema structures to support large-scale data transactions.
  • Implement bulk data operations, indexing, and partitioning for high-speed retrieval (see the bulk-load sketch after this list).
  • Optimize stored procedures and concurrency controls to support high-frequency transactions.
  • Use sharding and distributed database techniques for enhanced scalability.
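A minimal sketch of a bulk load into SQL Server with SqlBulkCopy (Microsoft.Data.SqlClient); the connection string, target table, and column names are hypothetical.

```csharp
// Illustrative only: loading a large batch into SQL Server with SqlBulkCopy,
// which is typically far faster than row-by-row INSERTs. Connection string,
// table, and column names are hypothetical; assumes Microsoft.Data.SqlClient.
using System.Data;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

static class BulkLoadSketch
{
    public static async Task BulkInsertAsync(DataTable rows, string connectionString)
    {
        await using var connection = new SqlConnection(connectionString);
        await connection.OpenAsync();

        using var bulk = new SqlBulkCopy(connection)
        {
            DestinationTableName = "dbo.Transactions", // hypothetical target table
            BatchSize = 10_000,                        // commit in chunks to limit log pressure
            BulkCopyTimeout = 0                        // no timeout for very large loads
        };

        // Map source columns to destination columns explicitly.
        bulk.ColumnMappings.Add("Id", "Id");
        bulk.ColumnMappings.Add("Amount", "Amount");
        bulk.ColumnMappings.Add("CreatedUtc", "CreatedUtc");

        await bulk.WriteToServerAsync(rows);
    }
}
```

Committing in batches keeps transaction-log pressure bounded during very large loads.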

5. Caching & Load Balancing

  • Deploy Redis/Memcached/IndexedDB caching to improve database query performance (see the cache-aside sketch after this list).
  • Implement data pre-fetching & cache invalidation strategies for real-time accuracy.
  • Optimize load balancing techniques for efficient request distribution.
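A minimal cache-aside sketch over IDistributedCache with the Redis provider; the Product type, key format, expiry window, and loader delegate are assumptions.

```csharp
// Illustrative only: a cache-aside lookup over IDistributedCache (Redis provider),
// with an absolute expiry so stale entries are invalidated automatically.
// The Product type, key format, and loader delegate are hypothetical.
using System;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public sealed class ProductCache
{
    private readonly IDistributedCache _cache;

    public ProductCache(IDistributedCache cache) => _cache = cache;

    public async Task<Product?> GetAsync(int id, Func<int, Task<Product?>> loadFromDb)
    {
        var key = $"product:{id}";

        // 1. Try the cache first.
        var cached = await _cache.GetStringAsync(key);
        if (cached is not null)
            return JsonSerializer.Deserialize<Product>(cached);

        // 2. On a miss, load from the database and populate the cache.
        var product = await loadFromDb(id);
        if (product is not null)
        {
            await _cache.SetStringAsync(key, JsonSerializer.Serialize(product),
                new DistributedCacheEntryOptions
                {
                    AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
                });
        }
        return product;
    }
}

public sealed record Product(int Id, string Name, decimal Price);
```

In an ASP.NET Core app the Redis provider would typically be registered at startup via services.AddStackExchangeRedisCache(...).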

6. Real-Time Data Synchronization & Streaming

  • Implement event-driven architectures using message queues (Kafka, RabbitMQ, etc.).
  • Utilize WebSockets / SignalR for real-time data synchronization (see the SignalR sketch after this list).
  • Favor incremental updates over full data reloads for better resource efficiency.
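A minimal sketch of server-side push over ASP.NET Core SignalR, broadcasting each change as a delta rather than a full reload; the hub, client method name, and InventoryUpdate type are hypothetical.

```csharp
// Illustrative only: pushing incremental updates to connected clients through a
// SignalR hub instead of having clients re-poll full datasets. Hub name, method
// name, and the InventoryUpdate type are hypothetical.
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

// Clients subscribe to this hub and receive only the deltas.
public sealed class DataHub : Hub { }

// A background service (or message-queue consumer) broadcasts each change as it arrives.
public sealed class InventoryBroadcaster
{
    private readonly IHubContext<DataHub> _hub;

    public InventoryBroadcaster(IHubContext<DataHub> hub) => _hub = hub;

    public Task PublishAsync(InventoryUpdate update) =>
        _hub.Clients.All.SendAsync("inventoryUpdated", update);
}

public sealed record InventoryUpdate(string Sku, int QuantityDelta);
```

The hub would be mapped with app.MapHub<DataHub>("/hubs/data"), and the broadcaster would typically be driven by a background service or message-queue consumer.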

Preferred Additional Experience

  • Experience handling large-scale databases and high-throughput data environments.
  • Expertise in distributed database architectures for large-scale structured data storage.
  • Hands-on experience with query profiling & performance tuning tools.