Recently the question "Number of projects per solution" was asked on Reddit, which led to some interesting discussions. Of course, the answer depends a lot on the overall size and nature of the application. In this post, I'll review different codebases to see what architectural choices industry leaders are making.
Before going through the case studies, let's first remember a few points:
Points to consider when partitioning .NET code
- **Impact on refactoring:** Having your code spread over multiple solutions can significantly slow down day-to-day refactoring, since popular refactoring tools (Visual Studio refactoring, ReSharper...) work within the confines of a single Visual Studio solution. There is room for a hybrid approach: use smaller solutions most of the time, but keep one solution that covers all projects for when big changes are needed. But this means an additional solution that must be maintained.
- **Build time:** When working with a large enough codebase, compile time can become an issue, since compilation is triggered frequently to run both manual and automated tests. Incremental builds, where only the projects affected by changes get rebuilt, are very useful. But sometimes, to get an acceptable build experience, some projects need to be unloaded manually or filtered out with a Visual Studio solution filter (a .slnf file). However, this degrades the refactoring and browsing experience. More details on this point can be found in a related article I wrote recently: Improve Visual Studio Build Performance.
- **Team organization:** Having fewer projects spread across multiple solutions can help enforce separation of concerns and reduce build times, and it may suit multiple teams with a narrower focus and well-defined service boundaries. The last case study in this article shows an application with more than 1,600 projects and more than 200,000 classes: in such a situation, nobody can develop without several solutions.
- **Cross-solution referencing:** A disadvantage of multiple solutions is that you must reference the DLLs produced by other solutions rather than referencing projects defined in the same solution. DLL referencing is a more fragile approach that breaks when a project's output location changes. In this situation, NuGet can be used to consume the projects of other solutions as packages, but this introduces additional infrastructure that must be maintained.
- **Dependency cycles prevented by the IDE:** All .NET IDEs detect and prevent dependency cycles in the graph of project dependencies. This is a strong incentive for careful design, since it prevents a lawless structure from emerging in large projects where classes would reference each other freely across boundaries. It forces a componentized approach to building an application, and it makes you question the definition of a component: a unit of reuse, a unit of development, a unit of features, a unit of release, a unit of test, a unit of build?
- **The physical nature of projects:** Typically, each project is compiled into a DLL or EXE assembly file. These are physical artifacts, and having tens or hundreds of DLLs around can cause versioning, deployment and maintenance issues. Therefore, when creating a new project, it is worth asking whether there is any physical reason that demands it. One frequent physical reason is that the assembly is loaded dynamically at runtime by a dependency injection (DI) framework.
- **Classes that do not run in the same process:** This is a good indication that these classes should be declared in different projects.
- **Separation of test and application code:** An example of the previous point: test code runs only in test processes, while application code runs in both test and production processes. Therefore, it is recommended to keep tests and application code in separate projects.
- **Project as an encapsulation container:** If a class is to be used only within its parent project, it should be declared as `internal`. Such a class can still be consumed by tests declared in another project thanks to the `InternalsVisibleToAttribute`. However, this attribute should not be used between application projects: when you feel the need for it, it is an indication that some classes should be merged into the same project.
- **Code compiled against different versions of .NET:** To maximize reuse, some code, such as domain classes, fits well in .NET Standard projects (which run everywhere), while some infrastructure code requires .NET 6/7 projects to take advantage of the latest platform enhancements.
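The last point can be sketched with a multi-targeted project file. Assuming a hypothetical `MyApp.Domain` project, domain code can be compiled once for both worlds:

```xml
<!-- MyApp.Domain.csproj (hypothetical name) -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- netstandard2.0 runs anywhere; net7.0 lets code on modern runtimes
         take advantage of the latest platform enhancements -->
    <TargetFrameworks>netstandard2.0;net7.0</TargetFrameworks>
  </PropertyGroup>
</Project>
```

Each target framework produces its own assembly, so consumers automatically pick the best match.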
All these points require trade-offs between:
- A single solution or multiple solutions.
- A few large projects or many smaller projects per solution.
There's no such thing as a perfect approach, so let's look at the choices made by some industry leaders.
There are many diagrams in the case-study sections below. All of them were generated with the NDepend dependency graph. I mention this because readers ask in the comments how they were generated.
Clean Architecture
Clean Architecture is a term coined by Uncle Bob (Robert C. Martin). It relates to principles for structuring projects in a way that is easy to understand and easy to modify as the project grows. It is becoming more and more common for structuring ASP.NET Core web apps. Here is the project dependency diagram of Jason Taylor's CleanArchitecture .NET solution template, available on GitHub.
We can see the test/application code separation through the src and tests solution folders. Furthermore, each application project represents a layer with standardized names and roles: Domain, Application, Infrastructure and WebUI. You can check the post Clean Architecture for ASP.NET Core Solution: A Case Study for a detailed look at this way of structuring a .NET solution.
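As a rough sketch (the four layer names follow the template's conventions described above; test project names are indicative), the solution layout looks like this:

```text
CleanArchitecture.sln
├── src
│   ├── Domain           entities and business rules, no dependencies
│   ├── Application      use cases, depends on Domain
│   ├── Infrastructure   persistence and external services, depends on Application
│   └── WebUI            ASP.NET Core app, depends on Application and Infrastructure
└── tests
    └── ...              one test project per application project
```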
NopCommerce
NopCommerce is a popular OSS eCommerce platform. It is much larger than the previous CleanArchitecture template, with 28 projects in total. However, most of these projects are small plugins, and the application code is nested within a few large projects: Core, Services, Data and Web.
- **Core** contains mostly the domain and some infrastructure abstractions. In the e-commerce context, the domain contains classes such as order, payment, store, partner, vendor, catalog, discount, GDPR...
- **Services** contains the infrastructure code implementing the domain listed above (order payment, caching, the various discounts...).
- **Data** contains the persistence code.
- **Web** contains the ASP.NET Core code.
Therefore, the NopCommerce engineers chose the few-large-projects approach. However, as noted above, this approach misses out on the IDE's dependency-cycle prevention between components, which only works at the project level. As a result, a large project like Nop.Services ends up as a super-component where almost everything is entangled with everything else (image below).
Such a large piece of tangled code is also called spaghetti code or a big ball of mud, which is, by definition, a piece of software or a component that lacks any discernible architecture. This doesn't mean that the code doesn't work well or that no effort was put into it. It means that the 700 types defined in Nop.Services are not separated into layers and together form a single large unit of build, development and testing that cannot easily be broken down into smaller components. This situation leads to extra maintenance costs. Later I will explain how to counter this phenomenon, because having large and cohesive projects pays off.
Microservices
Structuring a web application as multiple microservices is becoming increasingly popular. The promises of microservices are:
- **Scalability:** A smaller workload per team, as developers focus on individual services rather than on the entire monolithic application.
- **Faster development:** Shorter development cycles, because developers can focus on specific services.
- **Improved data security:** Microservices communicate with each other through secure APIs, potentially giving development teams better data security than the monolithic approach.
- **Language and technology agnosticism:** Because teams work independently of each other, microservices allow different developers to use different programming languages and technologies.
See below the project dependency diagram of the OSS solution run-aspnetcore-microservices. We can see the Clean Architecture principles applied to the Ordering concern (domain, application, infrastructure).
Also, the projects in this microservices solution look less coupled than in the NopCommerce diagram (previous section). However, some dependencies are not shown: for example, the service Basket.API consumes Discount.API by calling the method GetDiscount(), although their projects are not statically coupled. The key is that the gRPC framework is used to handle such a GetDiscount() call (RPC stands for Remote Procedure Call), as shown in the screenshot above.
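A minimal sketch of such a call could look like the following. The service and message names are illustrative, derived from the description above, not copied from the repository:

```csharp
// Hypothetical sketch: Basket.API consuming Discount.API over gRPC.
// No static project reference links the two services; the only shared
// artifact is the client generated from the discount .proto contract.
using Grpc.Net.Client;

var channel = GrpcChannel.ForAddress("https://discount.api:5003");
var client = new DiscountProtoService.DiscountProtoServiceClient(channel);

// Looks like a local method call, but crosses a process/network boundary.
var coupon = await client.GetDiscountAsync(
    new GetDiscountRequest { ProductName = "Samsung 10" });
```

This is why such a dependency does not show up in a static dependency diagram: it only exists at runtime.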
log4net
log4net is a popular OSS logging framework. Its code is nested in a single project, and another project contains the tests. In such a situation the single-project approach makes sense: log4net is a cohesive enough framework, and its clients don't want to deal with multiple assemblies, even bundled into a single NuGet package. Yet here too, a single large project led to the super-component phenomenon: in the log4net project, almost everything statically depends on almost everything else.
.NET base class libraries
Below is the diagram of the 166 assemblies of the .NET 7.0 preview BCL, located in the directory C:\Program Files\dotnet\shared\Microsoft.NETCore.App\7.0.0-preview.3.22175.4. Obviously the BCL is not cohesive the way a smaller API like log4net is. For example, the XML-related implementations don't all need to be loaded in memory if your application only deals with JSON. So it makes sense to divide its 18,000 types (10,000 of them public) into 166 projects.
NDepend
Here is the project dependency diagram of our own application. We also decided on a few large projects (NDepend.Core and NDepend.UI) surrounded by smaller projects for the various NDepend flavors (analysis & reporting, Visual Studio extension, Azure DevOps extension, ILSpy extension...). The base project NDepend.API contains only abstractions and is consumed both by our code and by third-party consumers of NDepend.API, to automate the main features of the product. Some users have reported analyzing literally thousands of .NET solutions, so this kind of automation really makes sense for them.
Even though we have large projects, we don't suffer from the super-component phenomenon, because we dogfood our own rules such as Avoid namespaces mutually dependent and Avoid namespaces dependency cycles. So, within a large project, we group classes into a hierarchy of namespaces, which we consider to be our components. As mentioned, there are benefits to having fewer, larger projects: easier refactoring, easier versioning, less maintenance, fewer physical artifacts to maintain. Fortunately, the C# compiler is very fast and compiles the 1,400 classes of NDepend.UI in 3 seconds on modern hardware.
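The namespaces-as-components idea can be illustrated with a hypothetical large project in which dependencies flow only one way between namespace layers:

```csharp
// Hypothetical sketch: one large project layered through namespaces.
// Static analysis rules then enforce that lower layers never reference
// upper layers and that no two namespaces are mutually dependent.
namespace MyApp.Core.Domain
{
    // Lowest layer: depends on nothing else in the project.
    public sealed record Order(int Id, decimal Total);
}

namespace MyApp.Core.Persistence
{
    using MyApp.Core.Domain; // one-way dependency: Persistence -> Domain

    public interface IOrderRepository
    {
        Order? Find(int id);
    }
}
```

The compiler cannot enforce this layering within a single project, which is precisely why such rules are needed.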
Roslyn
The core of Roslyn is made of the three compiler projects Microsoft.CodeAnalysis, Microsoft.CodeAnalysis.CSharp and Microsoft.CodeAnalysis.VisualBasic. Around these projects sits a galaxy of smaller projects providing services like workspace/solution/project handling, script running, expression evaluation...
Again, the few-large-projects approach makes sense here, since a compiler is somewhat cohesive: you might want to compile C# code without hosting the VB.NET compiler in memory, but you certainly don't want to use only a partial version of the C# compiler.
Visual Studio
With over 1,600 projects and over 200,000 classes, Visual Studio is arguably the largest .NET application in the world. I don't have any inside information about the number of solutions needed, but there are certainly many: clearly, the various teams need a narrower focus and acceptable build times. Most features are implemented as extensions and loaded on demand. Nobody uses all of Visual Studio's features in a given session, which means that most assemblies remain unloaded most of the time. Also, for performance reasons, Visual Studio spawns many child processes at runtime, which reinforces the relevance of having several solutions.
Conclusion
It seems that for large enough applications, the industry prefers fewer but larger projects, while for smaller applications, guidelines such as Clean Architecture prevail. The microservices section above also clarified the benefits of that approach.
If you're wondering how to structure your next .NET solution or improve existing ones, I hope the many points covered and the real-life examples will help you make the right decisions.
If you are interested in visualizing the architecture of your .NET projects, simply download the full-featured 14-day free trial of NDepend, start VisualNDepend.exe, analyze your solution(s) and go to the Dependency Graph panel.
As my father was one of the first programmers in the 1970s, I was lucky enough to switch from playing with Lego to programming my own microgames as a kid. Since then I have never stopped programming.
I studied mathematics and software engineering. In 2002, after a decade of programming and consulting in C++, I became interested in the new .NET platform. I had the opportunity to write a best-selling book (in French) on .NET and C#, published by O'Reilly, and I also ran some academic and professional courses on the platform and on C#.
During my consulting years, I gained experience with the challenges of architecting, developing and maintaining large and complex real-world applications. It seemed that monolithic legacy spaghetti would eventually plague any team large enough. As a result, I became interested in static code analysis and started the NDepend project in 2004.
Today, NDepend is run as an independent software vendor (ISV). With more than 12,000 enterprise customers, including many of the Fortune 500, NDepend gives a wide range of professional users around the world deeper insight into, and complete control over, their applications.
I live with my wife and our twin children, Léna and Paul, on the beautiful island of Mauritius in the Indian Ocean.