An odyssey into the world of concurrency
The Editorial Board - Teamware Solutions


As developers, our lives can feel dull, lacking the thrill and adventure we might have found in any other profession. Let us narrate an experience Ananya had a week ago, in her own words: one thrilling enough that a boring C# program reading data from a database feels like a battlefront, with the taste of sweet victory and the repercussions of such a win everywhere.

It was a mild, sunny evening in the bustling city of Chennai, with a long weekend around the corner. Many city dwellers had already left for their hometowns or nearby weekend getaways, leaving the city to the likes of us. The early part of any career in modern competitive life means next to no social life, and even more so for a developer: an already crowded market where not just humans but, these days, artificial intelligence agents too are in the lurch for coveted software developer titles. Like any other developer after a few years, I did not feel new to the task. I felt at the top of the pecking order, with half a decade to count as my experience. But like nature, the strange, complex world of logic has its own way of putting us back in our rightful place, and the recent battlefront of reading data from a SQL Server made me feel nothing short of a developer booting her career up.

The saga

After an uneventful standup call, the Architect spelt out the roadmap. Like any other company's, it had components of generative AI. Buzzwords sneaking into our humble backyard of standup calls are like hand-to-hand skirmishes, the kind you would have in the mess hall with fellow bunkmates after a pointless argument. I was as bored as I could ever be. On top of that, I was not good at doodling! I wish I could add that as a goal for this year's appraisal. Anyway, amongst these thoughts I felt I heard my name. Like a commander calling for a volunteer to grease the motor components of a tank to make it battle ready, my Architect called on me to volunteer for some grunt work. With nearly half a decade to count as experience, I felt I was not cut out for such grunt work. But the pressure to have a timesheet code to book hours against for the week made me willing to volunteer.

Did I not spell out what the work was? My bad; it is so dull that I could not stop myself from calling it grunt work. Frustration apart, the task was to query a database from a C# application. Nothing fancy about it, right? The only glimmer of challenge I saw in the task was that the database to be queried had file tables: tables where a varbinary column houses a file's bytes. The roadmap had milestones for the C# application to later grow into an observer that applies a few logic filters to pick files, which are then cloned into the foreign estates of cloud storage.

First draw

The dullness within me, and my utter neglect of real challenges, made me write a console application that did just this:

using Dapper;
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

namespace DapperExample
{
    class Program
    {
        static void Main(string[] args)
        {
            var connectionString = "Did you think I will leave the key to our kingdom here?";
            var query = "SELECT * FROM Files";

            using (var connection = new SqlConnection(connectionString))
            {
                try
                {
                    // Query<T> returns an IEnumerable<File>; AsList() buffers it into a List.
                    List<File> files = connection.Query<File>(query).AsList();
                    foreach (var file in files)
                    {
                        Console.WriteLine($"File ID: {file.Id}, Name: {file.Name}, Size: {file.Size}");
                    }
                }
                catch (Exception ex)
                {
                    Console.WriteLine($"Error: {ex.Message}");
                }
            }
        }
    }

    public class File
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public long Size { get; set; }
        public byte[] Content { get; set; }
    }
}

I carry a personal dislike for the term Information Technology for our profession. If I had my way, I would happily write Software * everywhere rather than spell out Information Technology. This sunny evening, I was reminded of that dislike again. In the profession of Information Technology, information always flows late, arrives out of shape, and often never arrives at all! I was about to send the mail to my boss saying: job done, let me have some social life. Then the azure-coloured rounded rectangle with the smiling face of my boss popped up at the bottom-right corner of my monitor. That rectangle bore the prodigal information: the keys to the kingdom. The connection strings.

The perennial twist

My neglect of real challenges came back to bite me. Not that I was unprepared, but it did come as a surprise, particularly with the mail claiming the task done and dusted sitting in my drafts. I learnt that the database trailed production by T-5 days and had 67,000 records in the Files table. Now, if we had to lay the foundations for the ambitious goals of the roadmap, we would have to think of a different strategy. What happens when the data grows from 67,000 to, say, 80,000? Are we going to recall the release and make a new one for, say, 81,344? Then what happens when the Files count grows to, say, 1,00,000? Here, I believe, is where the real challenge rested. Probably this is why my boss asked me to take a stab at it.

Let me spell out the risk. As the rows grow, the memory occupied by the File objects while the program runs will grow too, since we are loading them all into a List. The memory occupied by such objects will not be ready for collection by the vultures of the CLR world, the garbage collector. Notice that the growth rate is also variable: the File metadata such as Name, Id and Size is relatively small, but the Content byte array could balloon within a File count of even 100. That makes it impractical to hold even the 67,000 rows we know of today, let alone the lakhs of the future.
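The difference between buffering everything and handing out one row at a time can be shown without a database at all. Here is a minimal sketch (names like StreamAll are my own, not from the original program) contrasting a List that is materialised up front with an iterator that yields elements lazily:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class StreamingSketch
{
    // Buffered: every element is materialised before the caller sees any,
    // so memory grows with the row count.
    public static List<int> LoadAll(int count) =>
        Enumerable.Range(1, count).ToList();

    // Streamed: yield return hands one element at a time to the caller,
    // so only the element currently being processed needs to be alive.
    public static IEnumerable<int> StreamAll(int count)
    {
        for (var i = 1; i <= count; i++)
            yield return i;
    }

    public static void Main()
    {
        // The streaming version lets the consumer stop early: only three
        // elements are ever produced here, not 67,000.
        var firstThree = StreamAll(67_000).Take(3);
        Console.WriteLine(string.Join(", ", firstThree)); // 1, 2, 3
    }
}
```

With the List, all 67,000 items sit in memory until the collection itself becomes unreachable; with the iterator, each item is eligible for collection as soon as the loop moves on.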

Second draw

I tapped into the world of async programming. Not that I am opening up a fantasy land to you; it is nothing new. However, applying a solution, feeling the real gravity of a solution to a potent problem, is the thrill we developers thrive on. So here is my stab: still not final, but a shot more worthy than the first code-and-forget piece.

        public static async Task Main(string[] args)
        {
            await foreach (var fileRecord in StreamFileRecordsAsync())
            {
                Console.WriteLine($"File ID: {fileRecord.Id}, Name: {fileRecord.Name}");
            }
        }

        public static async IAsyncEnumerable<File> StreamFileRecordsAsync()
        {
            using (var connection = new SqlConnection(ConnectionString))
            {
                await connection.OpenAsync();
                var query = "SELECT * FROM Files";

                // QueryUnbufferedAsync streams rows as the data reader
                // produces them, instead of materialising the whole result set.
                await foreach (var file in connection.QueryUnbufferedAsync<File>(query))
                {
                    yield return file;
                }
            }
        }

Yey! We are now streaming data. The big changes are the async…await keywords, running the query unbuffered so rows flow straight from the data reader, and pairing yield return with await foreach so each row is processed as it arrives instead of waiting for the whole operation to finish.
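The same await foreach mechanics can be seen in isolation, without SQL Server or Dapper. Here is a minimal sketch (the ReadLinesAsync name and its data are mine, purely for illustration) of an async iterator and its consumer:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public class AsyncStreamSketch
{
    // An async iterator: each element is produced only when the consumer
    // asks for the next one, and the method can await in between.
    public static async IAsyncEnumerable<string> ReadLinesAsync()
    {
        foreach (var line in new[] { "alpha", "beta", "gamma" })
        {
            await Task.Yield(); // stand-in for a genuinely async read
            yield return line;
        }
    }

    public static async Task Main()
    {
        // await foreach pulls items as they become available, rather than
        // waiting for the full sequence to materialise first.
        await foreach (var line in ReadLinesAsync())
            Console.WriteLine(line);
    }
}
```

Swap the hard-coded array for a database reader and you have the shape of the streaming solution above: the consumer's loop body runs once per row, while only one row at a time is in flight.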

The flip side, you ask? I would be a novice to say there is none. For starters, what about connection timeouts?

Aha, got you. You did not see this coming, right? We will park ourselves in this nice spot, and save the mystery of what I did next to clear that yet-unseen hurdle for the next dispatch.
