
100 days of Duolingo

A few days ago, I reached an exciting milestone: 100 days of Duolingo.

For the past few months, I have been learning Dutch. Graduation is coming up for me, so I have been thinking about various career pathways, and since last year I have had Dutch fever: I thought it would be interesting to work in the Netherlands someday.

The last 100 days have gone by so quickly. Practising Duolingo daily has been a memorable journey so far - a core memory is grinding it during my rural rotation. I remember practising it on the V/Line train on the commute to my rural placement, and practising late at night in the accommodation.

Practising Duolingo has been a delight because the company has invested quite heavily in the interface. Prior to learning Dutch, I was only familiar with the memes about Duo the Owl menacingly threatening you to maintain the streak. But the company lives up to its reputation - the animations, the haptic feedback and the overall cohesiveness of the different screens blew me away back then and still blow me away 100 days later. As an aspiring developer, the interface is something worth admiring and trying to emulate. They have also really nailed the gamification aspects of the app.

However, it is not without its flaws. I'm not complaining about its freemium model, but I'm annoyed that some ads don't load correctly after completing a lesson. This forces me to either invoke the home screen or swipe down to bring up Notification Centre. Additionally, I think the speech-to-text interface could be improved. While it is a novel idea, there have been too many occasions where the app cannot pick up what I've said. As a result, I'm unsure whether my pronunciation is actually accurate and, though speaking is an important part of learning a language, I try to avoid practising it.

My last critique concerns Duolingo's learning content. A core part of the learning experience in Duolingo is just constant exposure, which can lead to a lot of grinding. This is good, but I'm disappointed that Duolingo only tries to teach by showing vocabulary. I wish there was a quick preview moment where the app demonstrates how the grammar works before exposing learners to the vocabulary. I've made many mistakes where my grammar was incorrect and it was difficult to understand why. Luckily, in these cases, I reached for a textbook, which helped a lot. But without additional resources, I would've struggled a lot more and for no reason.

Nevertheless, I’ve been enjoying learning Dutch and Duolingo has been a big part of it. I initially tried to learn Dutch last year by downloading an Anki deck, but it was quite daunting and difficult to maintain that habit. So I think Duolingo is a great way to break down the daunting barriers of learning something new. Starting anything is uncomfortable and difficult, and I appreciate how Duolingo is essentially a framework that sets up guard rails to ensure that you stay on track. I look forward to continuing to practise and hopefully providing another update at the 200-day mark.

A Small Update

Today is a unique day - it only comes around every leap year. Hence I thought it was appropriate to post a little update.

My life is generally divided into 3 responsibilities: studying dentistry, app development and trying to develop a startup idea.

I am in my final year of the dental program. I have been seeing patients almost every day since mid-January. It’s very rewarding and provides insight into what post-graduation life looks like. There are 7 rotations before graduation, and I am on my 2nd. My 2nd rotation is a rural placement, so I am located 2.5 hours from Melbourne.

After my last post last year, the assessments ramped up and I had little time for app development. At the same time, iOS 17 was released, which meant there were a lot of API changes to SwiftUI. It took some time to catch up and understand the changes, as well as the new frameworks introduced. During the holidays, I have been working on a separate app as well. I am currently working on that whenever I have the chance.

Also last year, I wrote about CBCT following a conversation with a radiologist. Since then, I have been talking to a lot of dentists about the idea and I have a general concept for a startup. Currently, I am in the process of assembling a team and developing a business plan. There are some milestones I’d like to achieve within this year for the startup to actually materialise. I shall post updates as I hit them.

2024 is all about biding time and putting in the hours. Let’s hope it all pays off.


Weeknotes № 4

It’s been a long time since I wrote anything.

There’s a reason for this. I can explain. Let me explain… Oh wait, sorry were you waiting for me to explain?

I had a one-week university break from August 14th - 18th. Perfect time to polish the app and launch it for beta. I programmed every single day. I even pulled all-nighters. I barely managed to launch the beta, but bugs popped up everywhere.

Carry on to the week after. I had a week-long placement with a 2-hour commute each way. I was already tired from the week before and the placement was draining. But I kept pushing.

Suddenly, it’s the first week of September. I managed to submit to the App Store, but got rejected because of in-app purchase issues. I also took the time to think about the purpose of my app. I realised that I had added too many features and was competing in the wrong market. I tentatively began a redesign.

This week I’ve been working hard on the redesign. However, I am starting to realise I am running out of time. Financially. Mentally. Physically. I’ve been pushing myself to the brink trying to manage coding and studying at the same time. It’s getting to a point where it’s becoming infeasible for me.

So essentially, I haven’t been posting because every week I thought I could deliver soon. However, soon has stretched into almost a month.

It’s time to wrap things up. Exams are coming up so I will start studying for that. I plan to keep the app minimalistic before the launch and start adding more features after exams.


Weeknotes № 3

This week was busy, as per usual. The app is getting closer and closer to completion, but that’s something I need to remind myself everyday.

I think this is the curse of the creator: I can only see the bugs and the lack of features in my app. However, if you really look at it, I achieved quite a lot in my app. It is quite usable. The only reason why I’m so harsh on myself is because

I implemented a few new features this week, but mostly worked on fixing bugs. I detected a serious bug in user ordering, so I’ll have to add an addendum to the article. There is also an issue in the validation code (aka the re-insertion), so I’ll fix that before the beta launch.

I’m entering Week 6 of development, so at this point in time, I’m running more on desperation and fumes than passion. This is close to being done, and I’m so excited for it.


Enums for Managing Multiple User Actions

This is a really interesting article on building large-scale SwiftUI apps. There is some sage wisdom here.

The particular lesson I learnt from the article is about enums. They are so handy when you don’t want to expose your entire view. Additionally, you can avoid using View Models to power your user interactions, which makes your view more reusable (something I try to achieve for building SwiftUI previews).

Here is the example that they provide in the article:

// Each event case carries the index of the row it originated from.
enum ReminderCellEvents {
    case onChecked(Int)
    case onDelete(Int)
}

struct ReminderCellView: View {
    let index: Int
    let onEvent: (ReminderCellEvents) -> Void

    var body: some View {
        HStack {
            Image(systemName: "square")
                .onTapGesture {
                    onEvent(.onChecked(index))
                }
            Text("ReminderCellView \(index)")
            Spacer()
            Image(systemName: "trash")
                .onTapGesture {
                    onEvent(.onDelete(index))
                }
        }
    }
}

struct ContentView: View {
    var body: some View {
        List(1...20, id: \.self) { index in
            ReminderCellView(index: index) { event in
                switch event {
                    case .onChecked(let index):
                        print(index)
                    case .onDelete(let index):
                        print(index)
                }
            }
        }
    }
}

Weeknotes № 2

I promised that I would make a regular series of this, but I’ve been busy coding for the past few weeks. Nonetheless, I plan to release some articles after this one.

Progress on my app is going well. From time to time, I run into roadblocks which frustrates me greatly. However, I am grateful that I am able to overcome them one by one.

I’ve also been working on the business side of things. I registered my Apple Developer account, created some social media accounts for the app and registered a domain name as well. On the developer account front, I have successfully uploaded my app to TestFlight. So it’s a matter of actually polishing the damn thing so I can push for beta.


Weeknotes № 1

I’m trying to write more and utilise this blog more. This is the start of a weekly series writing about my development journey. Though I have clinics every day, I probably won’t publish those adventures due to patient privacy.

My Dev Log series has been quiet, but I’ve been toiling away on something else since. I would like to start another series to chronicle the development process, but I want to have developed something more substantial before I write anything about it.

Maybe I’ll use the weeknotes as an opportunity to update the blogosphere on my progress. And I’ll add some interesting learnings from time to time. I really enjoyed writing about user sorting and I’m proud to say that I’ve incorporated the logic into my app. My app is somewhat usable now, to the point where I can start dogfooding it.

I’m really impressed by how much code lies behind the user interface. I’m making a todo app, but there were a lot of steps before I got to toggling the todo state. But I’m happy with this, as it feels like I have a firm footing when writing code on top of it. It feels like I’m not tripping up.

It’s funny to express coding in terms of emotional experiences. I suppose coding is ultimately a creative endeavour so, much like art, it tingles our brains when our work is technically sound too. I guess it’s also because I’m working inside a framework (i.e. SwiftUI), so it feels harmonious when I’m working with the framework rather than against it.


User Sorting in Swift and Core Data

User Sorting is an interesting computer science and user experience problem. On the surface, it can seem very straightforward but can get very complicated depending on your implementation.

Unlike traditional sorting algorithms, the heavy work of the sorting is already solved by the database of choice. The difficult part is respecting the user’s manual reordering of items outside of their usual metadata.

There is a lot of literature on the subject. I found a blog post from 2018 where the author created a Postgres extension using fractions as the index. Another solution uses strings as the index, which means that resorting is required far less often. I personally relied heavily on this Stack Overflow answer. The solution recomputes the index of each element after the move operation is completed.

However, in my eyes, the implementation used in Things 3 is the holy grail of user sorting. If you have a sneak peek into their local database, you can see that each record has an index column.


For the longest time, I was unsure how the index was calculated. However, this blog post shed some light.

The algorithm is very simple (on paper):

  1. Create indices with sufficiently large gaps in-between. Avoid consecutive numbers as much as possible - you want to avoid recomputation.
  2. Inserting an element requires knowing the index values of its neighbours, and using the mean of those values for the new index value.
  3. In the case that the interval between elements is too close/short, grab all the offending index values and regenerate values. This can be considered a routine ‘clean up’ of the database.
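A minimal sketch of those three steps, assuming plain `Int64` indices (the gap size and function names are my own, not from the linked post):

```swift
// Step 1: seed indices with large gaps so insertions rarely collide.
let gap: Int64 = 1_000

func initialIndices(count: Int) -> [Int64] {
    (0..<count).map { Int64($0 + 1) * gap }
}

// Step 2: a new element takes the mean of its neighbours' indices.
func insertionIndex(between lower: Int64, and upper: Int64) -> Int64 {
    (lower + upper) / 2
}

// Step 3: if any pair of neighbours is too close to split again,
// regenerate the whole range as a routine clean-up.
func rebalance(_ indices: inout [Int64]) {
    let tooClose = zip(indices, indices.dropFirst()).contains { abs($1 - $0) < 2 }
    if tooClose { indices = initialIndices(count: indices.count) }
}
```

With `gap = 1_000`, inserting between two fresh neighbours yields the midpoint (e.g. 1,500 between 1,000 and 2,000), and about ten consecutive insertions into the same slot are possible before a rebalance is needed.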

My attempt included a few more constraints:

  1. Rather than calculating the average of its neighbours’ indices, the newly inserted element will randomly generate an index using its neighbours as an upper and lower bound.
  2. The bottom-most element will always have an index value of 0.
  3. I want the methods to be agnostic of the view. The view will provide me information such as the size of the array and the elements. However, I want to rely on querying the database, as I can then programmatically add items to a list. This will be useful if I ever provide App Intents support in the future (which I most likely will).

#1 has advantages and disadvantages. I like to use big values (the max boundary is 999,999) and I found that I need to recalculate more frequently if I use the average. On the other hand, randomly generating the index value can result in duplicate values, so there is more checking required.

#2 is actually really useful because it helps reset the index values. I noticed that, after several iterations of sorting, the index values can get clustered around a certain range of numbers. Resetting the bottom-most index to 0 allows the list to become more evenly distributed again.

The initial challenge was to understand how SwiftUI’s .onMove modifier communicates the user’s interaction. The docs show that there are 2 parameters, IndexSet and Int. The former is the set of all the indices touched, while the latter is where the user tried to move the item in the array. I realised that the user’s intent - moving the item either further up or further down the hierarchy - can be inferred by comparing the item’s initial index with the destination.
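As a sketch, that comparison can be reduced to a pure function over the two .onMove parameters (the `MoveDirection` enum and the function name are my own):

```swift
import Foundation

// Interpreting .onMove's two parameters: `source` holds the offsets of the
// dragged rows, and `destination` is the offset they land at, counted
// against the original array.
enum MoveDirection { case up, down, none }

func direction(of source: IndexSet, to destination: Int) -> MoveDirection {
    guard let origin = source.first else { return .none }
    if origin > destination { return .up }    // dragged towards the top
    if origin < destination { return .down }  // dragged towards the bottom
    return .none
}
```

The branches of `move(item:origin:destination:)` below follow exactly this split.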

So based on this, my plan of attack was:

  1. When the user adds an item to the list, its index value is 0. The item before that will be recalculated.
  2. The same applies if the user drags an item to the very bottom of the list.
  3. If the user moves the item further up, check its future neighbours’ values to calculate the new index value.
  4. The same applies if the user moves the item down.

After the insertion, I also run a validate function, which includes recalculateBounds and checkUniqueness. The former checks if the gaps between the indices are getting too small, so it tries to spread them out based on the boundary provided. The latter checks if items have the same index value and will try to spread them out further.

To make this easier to understand, I have provided the source code below. The GitHub repository is also available. This code is still fresh, so I may go in and tidy it up in the future.

My main motivation for writing this post is because I’m sure there are other people who have been struggling with this problem too. While my solution may not be perfect, hopefully it can provide a good launching point for others.

import Foundation
import CoreData

final class ContentViewController {
    static let shared = ContentViewController()
    private var persistence = PersistenceController.shared

    func add(item: Item) {
        item.index = 0
        var fetch = fetchLowestItems()
        if fetch.count > 1 {
            fetch.removeFirst() // discard the first item because it'll always be the one that we just added
            let lowest = fetch.removeFirst()
            if lowest.index >= 0 {
                print(fetch.count)
                if fetch.count == 1 {
                    let neighbour = fetch.removeFirst()
                    lowest.index = assignIndex(start: neighbour.index, end: 0)
                } else {
                    lowest.index = assignIndex(end: 0)
                }
            }
        }
    }

    func move(item: Item, origin: Int, destination: Int) {
        print("Origin: \(origin)")
        print("Destination: \(destination)")

        if destination == 0 {
            if let firstItem = fetchHighestIndex() {
                item.index = assignIndex(end: firstItem.index)
            }
        } else if origin > destination {
            // `move` will always insert it below the destination when trying to move
            if let parentNeighbour = fetchItem(at: destination),
               let descNeighbour = fetchDescendingNeighbour(at: destination) {
                item.index = assignIndex(start: parentNeighbour.index, end: descNeighbour.index)
                validate([descNeighbour, item, parentNeighbour])
            }
        } else if origin < destination {
            if let firstDescNeighbour = fetchItem(at: destination) {
                if let secDescNeighbour = fetchDescendingNeighbour(at: destination) {
                    item.index = assignIndex(start: firstDescNeighbour.index, end: secDescNeighbour.index)
                    validate([secDescNeighbour, item, firstDescNeighbour])
                } else if let parentNeighbourIndex = fetchIndex(at: destination - 1) {
                    firstDescNeighbour.index = assignIndex(start: parentNeighbourIndex, end: 0)
                    item.index = 0
                }
            }
        }
    }

    private func fetchHighestIndex() -> Item? {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: true)]
        request.fetchLimit = 1
        do {
            let items = try self.persistence.container.viewContext.fetch(request)
            return items.first
        } catch {
            return nil
        }
    }

    private func fetchLowestItems() -> [Item] {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.predicate = NSPredicate(format: "%K <= 0", #keyPath(Item.index))
        request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: false), NSSortDescriptor(keyPath: \Item.timestamp, ascending: false)]
        request.fetchLimit = 3
        do {
            let items = try self.persistence.container.viewContext.fetch(request)
            return items
        } catch {
            return []
        }
    }

    private func fetchDescendingNeighbour(at destination: Int) -> Item? {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.fetchOffset = destination
        request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: true)]
        request.fetchLimit = 1
        do {
            guard let item = try self.persistence.container.viewContext.fetch(request).first else { return nil }
            return item
        } catch {
            return nil
        }
    }

    private func fetchDescendingNeighbour(below index: Int64) -> Item? {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.predicate = NSPredicate(format: "%K > %ld", #keyPath(Item.index), index)
        request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: true)]
        request.fetchLimit = 1
        do {
            guard let item = try self.persistence.container.viewContext.fetch(request).first else { return nil }
            return item
        } catch {
            return nil
        }
    }

    private func fetchParentNeighbour(above index: Int64) -> Item? {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.predicate = NSPredicate(format: "%K <= %ld", #keyPath(Item.index), index)
        request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: true)]
        request.fetchLimit = 1
        do {
            guard let item = try self.persistence.container.viewContext.fetch(request).first else { return nil }
            return item
        } catch {
            return nil
        }
    }

    private func fetchItem(at destination: Int) -> Item? {
        let request = NSFetchRequest<Item>(entityName: "Item")
        request.fetchOffset = destination - 1
        request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: true)]
        request.fetchLimit = 1
        do {
            guard let item = try self.persistence.container.viewContext.fetch(request).first else { return nil }
            return item
        } catch {
            return nil
        }
    }

    private func fetchIndex(at destination: Int) -> Int64? {
        guard let item = fetchItem(at: destination) else { return nil }
        return item.index
    }

    /// Checks whether there's enough space for the items to be differentiated
    private func validate(_ items: [Item]) {
        recalculateBounds(items)
        checkUniqueness(items)
        cleanUp()
    }

    private func recalculateBounds(_ items: [Item]) {
        if let upperBound = items.max(by: { $0.index < $1.index }),
           let lowerBound = items.min(by: { $0.index < $1.index }) {

            if abs(upperBound.index - lowerBound.index) < 10 {
                var newUpperBoundIndex = fetchParentNeighbour(above: upperBound.index)?.index ?? -99999
                if abs(newUpperBoundIndex - upperBound.index) < 100 {
                    newUpperBoundIndex = -99999
                }
                var newLowerBound = fetchDescendingNeighbour(below: lowerBound.index)?.index ?? 0
                if abs(lowerBound.index - newLowerBound) < 100 {
                    newLowerBound = fetchDescendingNeighbour(below: newLowerBound)?.index ?? 0
                }

                let step = (newUpperBoundIndex - newLowerBound) / Int64(items.count + 1)
                print("step is \(step); lower: \(newLowerBound) - upper: \(newUpperBoundIndex)")

                // Spread the crowded items evenly between the two new bounds,
                // walking from the list-bottom (largest index) upwards.
                for (position, item) in items.sorted(by: { $0.index > $1.index }).enumerated() {
                    item.index = newLowerBound + step * Int64(position + 1)
                }

            }
        }
    }

    private func checkUniqueness(_ items: [Item]) {
        let request = NSFetchRequest<Item>(entityName: "Item")

        for item in items {
            request.predicate = NSPredicate(format: "%K == %ld AND %K != %@", #keyPath(Item.index), item.index, #keyPath(Item.timestamp), item.timestamp! as CVarArg)
            request.sortDescriptors = [NSSortDescriptor(keyPath: \Item.index, ascending: true)]
            request.fetchLimit = 1
            if let similarItem = try? self.persistence.container.viewContext.fetch(request).first,
               let upperBound = fetchParentNeighbour(above: item.index),
               let lowerBound = fetchDescendingNeighbour(below: item.index) {
                // Nudge the duplicate towards its upper neighbour and the item
                // towards its lower neighbour; min/max keeps the ranges valid.
                similarItem.index = Int64.random(in: min(similarItem.index, upperBound.index)...max(similarItem.index, upperBound.index))
                item.index = Int64.random(in: min(lowerBound.index, item.index)...max(lowerBound.index, item.index))
            }
        }
    }

    private func cleanUp() {
        if let zeroIndexItem = fetchLowestItems().first {
            zeroIndexItem.index = 0
        }
    }

    /// This is a dumb function. We need to write higher level functions for validation.
    private func assignIndex(start: Int64 = -99999, end: Int64) -> Int64 {
        return Int64.random(in: start...end)
    }
}

Task Managers are database interfaces

If you really think about it, to-dos can be laid out as database rows.

Tags, lists and other metadata are further interfaces to surface those rows.
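As a toy illustration of that claim (my own sketch, not any particular app's schema): a to-do is a row, and a tag or list is just a saved query over those rows.

```swift
// A to-do as a database row; `tags` is plain metadata on the row.
struct Todo {
    let title: String
    let tags: Set<String>
    var done: Bool
}

// A "list" is nothing more than a filter surfacing the matching rows.
func list(_ rows: [Todo], tagged tag: String) -> [Todo] {
    rows.filter { $0.tags.contains(tag) }
}
```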

In the macOS space, the two clear winners of task managers are OmniFocus and Things. The former doesn’t try to hide its database origins (have you seen that inspector view?!) while the latter is a more refined and beautiful version of Reminders.

However, I believe that a space exists between a polished interface and exposing metadata to the user. Additionally, OmniFocus and Things 3 were released approximately 4 and 6 years ago respectively, and I think that user needs in the information age have changed a lot since then. A tool needs to be created that adapts to the user, rather than the user having to adapt to the tool.


CBCT

Here’s something that may surprise you: apart from having a special interest in Swift/SwiftUI, I spend the majority of my other time studying dentistry. In fact, I have been doing this for 3 years now! But a healthy dose of imposter syndrome meant that I never really knew what to write about. However, I think I finally found my first topic: my CBCT Observation experience.

CBCT (Cone Beam Computed Tomography) is a medical imaging technique that provides a 3D image of the patient’s mouth. It stitches together multiple captured images to build that view, and you can navigate along the X, Y and Z axes. The information provided is so detailed that you can literally see everything. In fact, it is so detailed that you can determine whether a patient has an allergy or an actual odontogenic infection based on the mucosal thickening of their maxillary sinus. As a result, it’s not surprising that CBCTs are mainly used for implant treatment planning and endodontic treatment.

While going through 5 cases, the radiologist and I also talked about the implications of AI and CBCT. We thought that it would be interesting for the AI to provide differential diagnoses based on the data. I currently know that there’s one company covering this space, as mentioned by this podcast.

During the session, I started thinking about whether there’s a way to read/parse CBCT files, aka the DICOM file format. A quick search on GitHub doesn’t yield much. A Google search shows this systematic review, which shows that DICOM mostly remains an unstandardised format. I would like to have a holiday where I can build a rudimentary CBCT reader. It’s far-fetched, but I think it would be a really interesting side project.
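The very first step of such a reader is small enough to sketch: a DICOM Part 10 file begins with a 128-byte preamble followed by the four ASCII bytes "DICM", so a rudimentary validity check might look like this (a sketch only, not a full parser):

```swift
import Foundation

// A DICOM (Part 10) file mandates a 128-byte preamble followed by the
// ASCII magic "DICM"; everything after that is the data set proper.
func looksLikeDICOM(_ data: Data) -> Bool {
    guard data.count >= 132 else { return false }
    return data.subdata(in: 128..<132) == Data("DICM".utf8)
}
```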
