Film-Tech Cinema Systems
Film-Tech Forum ARCHIVE


  
Author Topic: Leave the machine on or off at the end of the night?
Frank Cox
Film God

Posts: 2234
From: Melville Saskatchewan Canada
Registered: Apr 2011


- posted 04-29-2011 11:30 PM
Do you leave your digital projector and server turned on overnight, or do you switch the whole works off at the end of the day?

It's always been my habit to switch off the power to everything in my projection room (all of the power hookups in that room go through a bank of switches on the wall), so with the digital setup I'm rebooting the whole works every day. Alternatively, I could just leave the server and/or projector on standby. What do you recommend?

 |  IP: Logged

Kurt Zupin
Jedi Master Film Handler

Posts: 989
From: Maricopa, Arizona
Registered: Oct 2004


- posted 04-30-2011 01:15 AM
We shut down the projectors every night but we leave our servers on. They take a while to boot up, and we've never had any issue with leaving them on.

 |  IP: Logged

Tony Bandiera Jr
Film God

Posts: 3067
From: Moreland Idaho
Registered: Apr 2004


- posted 04-30-2011 11:53 AM
IF and ONLY IF your servers are on a quality UPS (Tripp-Lite makes a really good one for under $200) it is best to leave the servers on 24/7.

If your servers are not on a UPS, then you are stupid. There I said it and don't care. [Razz] [Big Grin]

As for the projector my instinct says to leave it on "Standby" rather than a hard power-down....but I would like to hear from the Grand Poo-Bah (Brad), Steve G and Sam C since they have more digital install experience than I do.

 |  IP: Logged

Steve Guttag
We forgot the crackers Gromit!!!

Posts: 12814
From: Annapolis, MD
Registered: Dec 1999


- posted 04-30-2011 02:15 PM
My opinion on the matter is...

Anything that moves air is also a "dirt-pump," so how clean the room is will greatly affect how filthy the equipment gets just from being left on. This is particularly true of the projectors. I recently cleaned a projector that had been in operation for only six months, and despite all of the projector's filtering, I was able to improve the light by over 6.4% by cleaning JUST the Dolby filter wheel! If you add in the cleaning of the other glass/mirror items, the light improvement was into the double digits. The booth was not particularly filthy, either. The back of the lens proved to be a BIG dust/dirt collector, just like with film. These machines were turned off each night; one could expect twice as much dirt had they been left on an extra 12 hours a night just to make noise.
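
Those small per-surface numbers compound because transmission losses multiply rather than add. A quick Python sketch of the effect; only the 6.4% filter-wheel figure comes from the post above, the other per-surface losses are made up for illustration:

[code]
# Per-surface light loss from dirt. Only the filter wheel number is from
# the post; the rest are illustrative assumptions.
losses = {
    "Dolby filter wheel": 0.064,  # measured gain from cleaning (per the post)
    "back of lens": 0.03,         # assumed
    "cold mirror": 0.02,          # assumed
    "port glass": 0.02,           # assumed
}

transmission = 1.0
for surface, loss in losses.items():
    transmission *= 1.0 - loss  # losses multiply through the light path

print(f"Combined light loss: {(1.0 - transmission) * 100:.1f}%")
# -> about 12.8%: no single surface is terrible, yet the total hits double digits.
[/code]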

On the server side of things, you don't have an optical element, BUT I have found servers absolutely clogged full of dirt too...which compromises their ability to shed heat, and that leads to failure. One stupid thing most companies do is run the airflow from the front to the back of the racked equipment, which is dumb. The air INSIDE the rack is controllable. QSC got this right: they pump from rear to front. Thus, one can put filters on INTAKE blowers to pressurize the rack (or pedestal) and let the equipment suck in nice filtered (and cool) air, then spit it out as warmer air for the booth's HVAC to deal with (heat load). The other drawback of the present designs is that each piece of equipment blows its hot air into the rack, helping to heat up the other equipment...just plain dumb.

That is the dirt side of the equation. Then there is the operational side. I've found that it is best to have at least a regular schedule of rebooting the equipment. It really seems to clear out problems before they start showing themselves. A reboot will often cause the equipment to purge memory, check the time/date and start things up fresh. So, even if you don't want to turn things off every night, I suggest you reboot everything at least once a week.

At present, all of my customers that are NOT part of a NOC turn the DCinema equipment off nightly. Most NOCs will require the equipment to remain on 24 hours a day so they can monitor it and possibly handle updates during the down-hours. In that case, I recommend having a regular schedule where you merely reboot the equipment periodically...once a week is sufficient, and in fact once a month would probably be okay too, but that's harder to keep track of.
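
For anyone who wants the longer interval without losing track, a simple date stamp is enough. A minimal Python sketch; the state-file location and the interval are hypothetical, not part of any cinema server's actual software:

[code]
# Minimal reboot-schedule tracker: warn when the equipment has gone
# longer than the chosen interval without a reboot. Purely illustrative.
from datetime import datetime, timedelta
from pathlib import Path

STATE_FILE = Path("/var/tmp/last_reboot.txt")  # hypothetical location
MAX_INTERVAL = timedelta(days=7)               # once a week, per the post

def reboot_is_due() -> bool:
    """True if more than MAX_INTERVAL has passed since the recorded reboot."""
    if not STATE_FILE.exists():
        return True  # no record yet: treat as overdue
    last = datetime.fromisoformat(STATE_FILE.read_text().strip())
    return datetime.now() - last > MAX_INTERVAL

def record_reboot() -> None:
    """Call this right after rebooting the equipment."""
    STATE_FILE.write_text(datetime.now().isoformat())

if __name__ == "__main__":
    print("Reboot due." if reboot_is_due() else "Rebooted recently.")
[/code]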

-Steve

 |  IP: Logged

Mike Blakesley
Film God

Posts: 12767
From: Forsyth, Montana
Registered: Jun 99


- posted 05-01-2011 12:59 PM
I've wondered about this too. On the one hand, I've heard that the hardest thing on a computer is the boot-up, and that one of the first things to go is the switches. But I can see the dirt-accumulation issues too. We usually have a max of two shows a day, so it seems to make more sense to turn it off, but still...there probably is no hard-and-fast answer to this one. You're going to compromise something either way.

 |  IP: Logged

Steve Guttag
We forgot the crackers Gromit!!!

Posts: 12814
From: Annapolis, MD
Registered: Dec 1999


- posted 05-01-2011 04:16 PM
With computers, I generally run 24 hours...I have had HDDs get "stiction" from being off. Even with computers, I find I have to reboot them periodically to clear out stuff and get them running back at peak. There is often a way to do a software reboot, too, which will accomplish most of the same thing as a hardware power-cycle.
-Steve

 |  IP: Logged

Scott Norwood
Film God

Posts: 8146
From: Boston, MA. USA (1774.21 miles northeast of Dallas)
Registered: Jun 99


- posted 05-01-2011 04:58 PM
I'm not saying that Steve is wrong, but I would argue that any piece of hardware or software that requires regular reboots to avoid issues is broken as designed and that the memory leaks (or other issues) need to be fixed by the manufacturer, not through a silly workaround by the end-user.

I have had various servers and pieces of network hardware (not in the cinema industry) that had uptimes of 1000+ days without issues. Eventually, hardware or OS upgrades and/or power failures forced reboots, though.

(So, yes, computers in general--especially those with spinning disks (as opposed to SSDs)--are best kept running 24/7. I am not qualified to say if this applies to D-cinema hardware or not, however.)

 |  IP: Logged

Terrence Meiczinger
Film Handler

Posts: 45
From: Orono, Me, USA
Registered: Dec 2008


- posted 05-01-2011 10:17 PM
Power consumption is something to consider as well. If you leave everything in a rack on and it is drawing 1500w (just a guess) @ $0.15/kWh (New England), that's gonna be about $1000 per rack per year. That adds up for larger theaters. There would have to be real statistical evidence that power cycling significantly reduces the equipment's life expectancy and increases the failure rate to justify that cost.
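
The $1000 figure checks out if the rack would otherwise be switched off roughly 12 hours a night. A quick Python check using the guessed numbers from the post (the 12-hour overnight window is an added assumption):

[code]
# Sanity check of the power-cost estimate. 1500 W and $0.15/kWh are the
# post's guesses; the 12 h/night idle window is an assumption.
RACK_WATTS = 1500
RATE_PER_KWH = 0.15
IDLE_HOURS_PER_NIGHT = 12

kwh_per_year = RACK_WATTS / 1000 * IDLE_HOURS_PER_NIGHT * 365
print(f"Overnight energy: {kwh_per_year:.0f} kWh/year")
print(f"Cost of leaving the rack on: ${kwh_per_year * RATE_PER_KWH:.0f}/year")
# -> 6570 kWh and about $985/year, i.e. roughly the $1000 quoted.
[/code]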

I've done power-cycling and thermal tests on enterprise network routers. We'd power cycle them 100,000+ times while varying the environmental temperature. The failure rates were not significantly different from units in the field under normal operating conditions.

 |  IP: Logged

Mike Blakesley
Film God

Posts: 12767
From: Forsyth, Montana
Registered: Jun 99


- posted 05-01-2011 10:59 PM
quote: Scott Norwood
I'm not saying that Steve is wrong, but I would argue that any piece of hardware or software that requires regular reboots to avoid issues is broken as designed
I think a lot of "reboot regularly" recommendations are to help avoid service calls. I can't even imagine how many malfunctioning electronic gadgets I've seen that were brought back to life with a simple power cycle.

 |  IP: Logged

Steve Guttag
We forgot the crackers Gromit!!!

Posts: 12814
From: Annapolis, MD
Registered: Dec 1999


- posted 05-01-2011 11:38 PM
I'm not disagreeing with you, Scott. I'm just saying that, in practice, a reboot seems to clear up issues. I can't control what the software hacks do.

-Steve

 |  IP: Logged

Mark Gulbrandsen
Resident Trollmaster

Posts: 16657
From: Music City
Registered: Jun 99


- posted 05-02-2011 10:32 AM
quote: Tony Bandiera Jr
If your servers are not on a UPS, then you are stupid. There I said it and don't care.


Right on Tony [thumbsup]! Ditto for the projector's electronics section and the local network switch. You should, however, have at least a 1 kVA UPS or larger. I typically use a 1.6 kVA Tripp Lite that runs about $800.00 online. It'll run the server, switch and projector electronics for about 15 min. before it shuts down. That gives plenty of time to go around a large multiplex and do proper power-downs on the gear. It also filters out all the nasty stuff that might come in on the AC line.
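
As a rough rule, runtime is usable battery energy divided by load, derated for the inverter. A Python sketch of that arithmetic; the battery capacity, load, and efficiency below are assumptions for illustration, not Tripp Lite specs:

[code]
# Rough UPS runtime estimate. All three figures are assumed; check the
# actual UPS datasheet for real numbers.
BATTERY_WH = 300     # assumed usable battery energy for a ~1.6 kVA unit
LOAD_WATTS = 900     # assumed draw: server + switch + projector electronics
EFFICIENCY = 0.85    # assumed inverter efficiency

runtime_min = BATTERY_WH * EFFICIENCY / LOAD_WATTS * 60
print(f"Estimated runtime: {runtime_min:.0f} minutes")
# -> about 17 minutes: in the ballpark of the 15-minute figure, enough to
# walk a multiplex and power the gear down cleanly.
[/code]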

Leaving the projector in standby and your server powered on is actually what you must do if you are monitored for VPFs or hooked to a contracted NOC. I have spent hours cleaning grease and dirt from digital projector lamphouses that have auto-align for the lamp or no filtering on the lamphouse air intake, so having a projector whose entire air intake is filtered is a big advantage. The projector will remain more or less spotless inside, but you will for sure need to change the filters at the prescribed intervals. Ditto for servers...if your server has air intake filters, as the GDC does, then leave it on if you're running 10 to 12 hours a day and put your projector in standby mode. Change the filters at the prescribed interval. Installing your server on rack slides makes changing filters easy.

If you run just a few hours a day, then I recommend shutting it all down for the night, as you are just wasting power. All this gear, with the exception of Qube (DATASAT), runs on Linux, and a daily boot-up or shutdown isn't going to hurt anything software related. As for the drives...failures are rare but can happen whether you leave them on or off. The MTBF point in the HD's life span will be reached much quicker if it is left on all the time, so IMHO you basically have the same chance of a drive failure leaving it running or powering it down. SATA drives are cheap, so you can have spares ready to go in the server, or, if your server allows it, you can play back directly off the DCP anyway. An SSD can always be installed if the OS drive fails, since that is a smaller drive, and SSDs are way more reliable. SSDs are just not cost effective for a large RAID 5 at this point in time.

As for having to reboot to clear issues...projectors and servers are getting better and the software that runs them is more refined. Reboots are not a thing of the past, but they are generally required far less frequently than in the early days of digital.

Mark

 |  IP: Logged

Scott Norwood
Film God

Posts: 8146
From: Boston, MA. USA (1774.21 miles northeast of Dallas)
Registered: Jun 99


- posted 05-02-2011 12:14 PM
As this article indicates, MTBF is a load of crap as applied to hard disks. In real life, they fail on a bathtub-shaped curve: most will either fail early or will last for a number of years. The key is to keep data on redundant disk arrays, keep spare disks on the shelf, and keep good backups of any valuable data.
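
The bathtub shape is often modeled as two overlapping failure modes: a decreasing infant-mortality hazard plus an increasing wear-out hazard. A Python sketch with purely illustrative Weibull parameters (none of these numbers describe any real drive):

[code]
# Bathtub curve as the sum of two Weibull hazard rates: early failures
# (shape < 1, decreasing) plus wear-out (shape > 1, increasing).
def weibull_hazard(t: float, shape: float, scale: float) -> float:
    """Instantaneous failure rate of a Weibull distribution at time t."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t_years: float) -> float:
    infant = weibull_hazard(t_years, shape=0.5, scale=10.0)  # illustrative
    wearout = weibull_hazard(t_years, shape=4.0, scale=8.0)  # illustrative
    return infant + wearout

for year in (0.1, 1, 3, 5, 7):
    print(f"year {year}: hazard {bathtub_hazard(year):.3f}")
# High early, low through the middle years, rising again late: the bathtub.
[/code]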

And I agree completely that a good UPS is important for any sort of critical computer or network hardware.

 |  IP: Logged

Mark Gulbrandsen
Resident Trollmaster

Posts: 16657
From: Music City
Registered: Jun 99


- posted 05-02-2011 02:10 PM
Scott,
That's hilarious, that Seagate claims 1.5 million hours for a Cheetah SATA drive. SCSI drives would typically last longer, though, because the build quality is better; more like 150,000 hours is realistic for any SATA drive. I think the manufacturer's MTBF is always averaged with the disks left running all the time. Of course they are going to want to show the best spec they can. The bathtub shape comes into play when those testing the drives also take into account that some are turned off and on on a daily (or more frequent) basis.

From my experience with SATA drives, leaving them on or off is meaningless as to their actual life span. It's going to vary all over the place no matter what. With SCSI drives, leaving them always running guarantees you a much longer life span, and they are just flat out better built. I have some 146 GB SCSI drives here that are in excess of 8 years old and they still run quietly and perfectly normally. Simple math says they have about 75,000 hours on them after running for 8 continuous years. The main thing is not to subject any running drive to vibration; vibration is what will kill them really fast.

No argument on running RAID levels for redundancy, except that a number of the common servers out there do not make that an option for the OS drive. In that case an SSD would be a wiser choice, as they are uber-reliable, and for the mostly read-only duty of an OS drive it would also reduce boot-up time dramatically.
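
One way to ground numbers like these is to convert MTBF into an annualized failure rate across a fleet: AFR is roughly powered-on hours per year divided by MTBF. A Python sketch; the 12-hour duty cycle is an assumption:

[code]
# Convert MTBF claims into annualized failure rates (AFR). MTBF describes
# a population average, not one drive's lifetime, which is why 1.5M hours
# can be "true" while individual drives still die young.
HOURS_PER_YEAR = 24 * 365  # 8760

for label, mtbf_hours in [("vendor claim", 1_500_000), ("150k-hour estimate", 150_000)]:
    afr_24x7 = HOURS_PER_YEAR / mtbf_hours * 100
    afr_12h = HOURS_PER_YEAR / 2 / mtbf_hours * 100  # powered ~12 h/day
    print(f"{label}: {afr_24x7:.1f}%/yr at 24/7, {afr_12h:.1f}%/yr at 12 h/day")

# Cross-check on the 8-year always-on drives: 8 * 8760 = 70,080 hours,
# close to the "about 75,000" figure above.
[/code]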

Mark

 |  IP: Logged

Bill Enos
Film God

Posts: 2081
From: Richmond, Virginia, USA
Registered: Apr 2000


- posted 05-02-2011 07:32 PM
And what is the absolute first thing the service tech does? Turns it off and back on, even if smoke is billowing from it.

 |  IP: Logged

Olivier Lemaire
Expert Film Handler

Posts: 118
From: Paris, Ile de France, France
Registered: Jan 2010


- posted 05-02-2011 11:33 PM
Over here (EMEA), most of the servers I see (roughly 920 deployed as of 2011-05-03) run 24x7, of course secured (UPS, plus an auto-shutdown procedure in case of low UPS battery).

This has an advantage: we can upgrade freely, and so keep the running fleet pretty up-to-date.
It has a drawback: you sometimes have to call the operator and ask him to reboot his Doremi unit so a firmware patch can be applied, because the unit has been up for 155 days...

 |  IP: Logged




© 1999-2020 Film-Tech Cinema Systems, LLC. All rights reserved.