Large collection of items

Topics: Developer Forum
Jul 16, 2008 at 2:57 PM
Edited Jul 16, 2008 at 3:00 PM

Started some testing with N2 yesterday and it looks really great. Keep up the good work :)

The story
I'm trying to re-implement an existing web application that has a large collection of dog/cat information.
Each pet has the following fields: Name, Age, TypeOfPet, CreatedDate, Status, LargeText1, LargeText2, LargeText3 and MainImage
(9 fields; some of the LargeText fields can be pretty big)

They have about 500 items (dogs/cats) in the existing database (growing every week).

1) What is the best practice to implement this in N2?

2) Are there any examples of paging with N2? (both frontend & admin edit)

3) Is it possible to do this at all without getting a very slow site? (9 fields * 500 items is going to hit the SQL server pretty hard.)

I guess this issue could also be relevant to other things like guestbooks, galleries, etc.

Jul 16, 2008 at 11:21 PM
Edited Jul 16, 2008 at 11:32 PM
Hi, interesting question. Esteewhy may have some practical experience in this area (with some luck he'll join the discussion). Now to your questions:

1. I'd try to group the items into smaller portions, e.g.
  • Dogs (Animals)
    • 2008 (BirthYear)
      • Fluffy (Dog)
This helps performance and manageability with hierarchical-style content.
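As a sketch, such a hierarchy could be modeled with N2 content item classes like the ones below. The class names are made up for this example, and the exact attribute names and `GetDetail`/`SetDetail` signatures may differ between N2 versions, so treat this as a starting point rather than the definitive API:

```csharp
using N2;
using N2.Details;
using N2.Integrity;

// Hypothetical container grouping pets by birth year.
[Definition("Birth year")]
public class BirthYear : ContentItem
{
}

// Hypothetical dog item with fields from the question, stored
// in N2's details collection via GetDetail/SetDetail.
[Definition("Dog")]
[RestrictParents(typeof(BirthYear))] // only allowed under a BirthYear node
public class Dog : ContentItem
{
    [EditableTextBox("Name", 100)]
    public virtual string PetName
    {
        get { return (string)GetDetail("PetName"); }
        set { SetDetail("PetName", value); }
    }

    [EditableFreeTextArea("Large text 1", 110)]
    public virtual string LargeText1
    {
        get { return (string)GetDetail("LargeText1"); }
        set { SetDetail("LargeText1", value); }
    }
}
```

With this structure the admin tree stays shallow at every level, which is what keeps both editing and child queries manageable.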

2. Sorry, can't think of any (mental note to self: make one). I often end up implementing a custom query-string-based approach. To page you can use either searching (N2.Find.Items...MaxResults) or filtering (N2.Collections.CountFilter). There's also a data source that can be paged through a DataGrid.
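A query-string-based page might look something like this. This is only a sketch: I'm assuming the `CountFilter` constructor takes a start index and a count, and the `Find` query syntax may vary between versions, so check against your N2 build:

```csharp
// Read the current page index from the query string, defaulting to 0.
int pageIndex;
int.TryParse(Request.QueryString["page"], out pageIndex);
const int pageSize = 20;

// Option A: filter the container's children in memory with CountFilter
// (assumed arguments: start index, max count).
var filter = new N2.Collections.CountFilter(pageIndex * pageSize, pageSize);
var pagedChildren = container.GetChildren(filter);

// Option B: limit the result set in the query itself.
var found = N2.Find.Items
    .Where.Parent.Eq(container)
    .MaxResults(pageSize)
    .Select();
```

Option A is simplest when the child collection is small; option B avoids materializing children you won't display.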

3. I don't think 500 is that many. How many users will you be having? Can you use output caching (this avoids a lot of performance issues)? By default the second-level cache is enabled, which absorbs a big chunk of the database hits. Don't take my word for it, benchmark it yourself. I did:
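Output caching in WebForms is just the standard ASP.NET directive at the top of the page template, e.g. caching the rendered output for a minute per query-string combination:

```aspx
<%@ OutputCache Duration="60" VaryByParam="*" %>
```

With that in place, most requests for the pet list are served straight from the cache and never touch N2 or the database at all.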

The benchmark:

I added different numbers of nodes beneath a single node and displayed the name and text1 of the first 50. Each node holds ~5 kB of data, matching your specification.

The numbers show the total number of pages in the database at each stage, followed by the number of nodes being tested each time (# of nodes below a single node: seconds for first load -> avg. seconds for subsequent loads)

Total#pages 103
100:    1.4 -> 1.1

Total#pages 503
500:    2.0 -> 1.0

Total#pages 8214
100:    10.3 -> 1.1
500:    10.4 -> 1.1
1000:    10.4 -> 1.2
5000:    11.4 -> 1.7

- Development laptop
- VS integrated development web server
- Local SQLite database
- 2nd-level cache enabled
- no output cache
- based on mvc news/comments example
- 9 values on each item, ~5 kB
- first 50 name + text1 shown, ~160 kB resulting page size

Note that this isn't scientific. You shouldn't look at any single number and draw conclusions, since the environment is far from realistic. Still, there's an interesting trend as the numbers grow.