
Thread: Compression? Who needs it.

  1. #61


    Quote Originally Posted by Trixter View Post
    YouTube only this year fixed 4:3 60p content presentation, so padding was the only way to ensure it worked.
    Not in my experience. I uploaded this 960x720 60fps video in September 2015, and even back then it played fine at 60fps -- as verified by a comment posted three years ago saying "60 fps?? on old footage like that?? That is very surprising!!!":


  2. #62
    Join Date: Jan 2014
    Location: Freedom City
    Posts: 6,363
    Blog Entries: 2


    Quote Originally Posted by pearce_jj View Post
    Incidentally NTFS performance dives over 80% capacity and really, really dives over 90%. I'm not sure of the reason but it's very measurable if you want to test.
    Oh I'm aware.

  3. #63


    Quote Originally Posted by pearce_jj View Post
    Incidentally NTFS performance dives over 80% capacity and really, really dives over 90%. I'm not sure of the reason but it's very measurable if you want to test.
    Have you measured this with an SSD? Couldn't it just be that spinning disks have a slower transfer rate at the end of the disk?
    Looking for a cache card for the "ICL ErgoPRO C4/66d V"

  4. #64
    Join Date: Mar 2011
    Location: Norway/Japan
    Posts: 923


    Quote Originally Posted by pearce_jj View Post
    Incidentally NTFS performance dives over 80% capacity and really, really dives over 90%. I'm not sure of the reason but it's very measurable if you want to test.
    It's because the filesystem starts struggling to find and allocate working space. It's not only NTFS; most filesystems (going all the way back to the first Unix filesystems) have the same issue, which is why it's a good idea to limit whatever can be limited so that you don't go above 80% on production systems (which we sell, where I work). The only filesystem I've worked with that doesn't suffer as much when nearing full is XFS. XFS also has a functioning defragmenter that can run live (on an active filesystem) and can be started and stopped as you see fit. Use that as well and you can fill the filesystem to the limit.
    But in general it's good practice (for any filesystem) to leave 20% free, if you can. If you can't, expect slowdowns.
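
    If you want to keep an eye on that rule of thumb, here's a minimal sketch (Python; the mount point and the 80/90% thresholds are just placeholders, adjust for your own systems):
    Code:
    #!/usr/bin/env python3
    """Warn when a filesystem crosses the ~80%-used rule of thumb."""
    import shutil
    import sys

    MOUNT_POINT = "/data"   # placeholder: the volume you care about
    WARN_AT = 0.80          # above this, expect allocation to start struggling
    CRIT_AT = 0.90          # above this, expect it to really dive

    def check(mount: str) -> int:
        usage = shutil.disk_usage(mount)
        used = usage.used / usage.total
        print(f"{mount}: {used:.1%} used, {usage.free / 2**30:.1f} GiB free")
        if used >= CRIT_AT:
            return 2
        if used >= WARN_AT:
            return 1
        return 0

    if __name__ == "__main__":
        sys.exit(check(MOUNT_POINT))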

  5. #65


    Quote Originally Posted by Krille View Post
    Have you measured this with an SSD? Couldn't it just be that spinning disks have a slower transfer rate at the end of the disk?
    Yes, as TOR alludes to, this is mostly a CPU bottleneck. The fall-off can be dramatic - like multiple orders of magnitude slower when writing new small files.
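
    If you want numbers, here's a rough sketch of the kind of test I mean (Python; the target directory, file count and size are placeholders) - run it on a mostly empty volume and again past ~80-90% full, then compare the files/s:
    Code:
    #!/usr/bin/env python3
    """Time the creation of many small files on the volume under test."""
    import os
    import time

    TARGET_DIR = r"D:\smallfile_test"   # placeholder: somewhere on the volume under test
    FILE_COUNT = 2000
    FILE_SIZE = 4 * 1024                # 4 KiB per file
    PAYLOAD = os.urandom(FILE_SIZE)

    def bench(directory: str) -> float:
        os.makedirs(directory, exist_ok=True)
        start = time.perf_counter()
        for i in range(FILE_COUNT):
            with open(os.path.join(directory, f"f{i:05d}.bin"), "wb") as fh:
                fh.write(PAYLOAD)
                fh.flush()
                os.fsync(fh.fileno())   # force it to disk so the allocator actually gets exercised
        return time.perf_counter() - start

    if __name__ == "__main__":
        elapsed = bench(TARGET_DIR)
        print(f"{FILE_COUNT} files in {elapsed:.2f} s ({FILE_COUNT / elapsed:.0f} files/s)")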

  6. #66
    Join Date: Jan 2014
    Location: Freedom City
    Posts: 6,363
    Blog Entries: 2


    I can understand that it takes time to allocate space. What's irritating is that, as far as I can tell as a plain user, the entire file is allocated before recording starts. In my book that allocation should be the slow part, but it isn't: as soon as the file is allocated, the OS reports that it exists at the full specified size, and that step takes very little time.
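
    For what it's worth, my guess (just an assumption, I haven't checked what the recorder actually does) is that the software only sets the end-of-file marker, which reserves the full size without writing or zeroing any data - that's why the size shows up instantly while the real allocation cost is paid as the data gets written. A little sketch (Python; the file name and size are placeholders):
    Code:
    #!/usr/bin/env python3
    """Show that extending a file to its full size is nearly instant."""
    import os
    import time

    PATH = "prealloc_test.bin"   # placeholder name on the volume under test
    SIZE = 10 * 1024**3          # 10 GiB, placeholder

    start = time.perf_counter()
    with open(PATH, "wb") as fh:
        fh.truncate(SIZE)        # just moves the end-of-file; no data is written yet
    elapsed = time.perf_counter() - start

    print(f"file reports {os.path.getsize(PATH) / 2**30:.1f} GiB after {elapsed * 1000:.1f} ms")
    os.remove(PATH)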
