  #1
10-20-2009, 08:04 PM
Slugger
SageTVaholic
 
Join Date: Mar 2007
Location: Kingston, ON
Posts: 4,008
Moving files while service is running using API calls?

My new NAS is just about ready to go, so now I'm looking at how I can use it for archiving. The key is that I want to move recordings programmatically, while SageTV is running. I've read the "moving files" FAQ, and all the methods described require shutting down the SageTV service. However, looking at the current SageTV 6.6 API docs, it seems to me I should be able to move files programmatically?

Here's what I want to try and do:

Basically this NAS I've built will have its hard drives exposed to the SageTV server as network drives. These drives will be added to SageTV as import directories only (i.e. no plans to record directly to the NAS).

My SJQ plugin knows how to identify the recordings I want to move off to the NAS; I just want to move each recording and tell SageTV about its new location.

If I'm reading the API docs right, I should be able to do this live, without the need to stop Sage, by running the following program (pseudocode using Sage API calls):

Code:
var $origFile = /path/of/orig/recording;
var $newFile = /path/to/new/location;

if(file copy of $origFile to NAS drive as $newFile succeeds) {
   var $airing = MediaFileAPI.GetMediaFileAiring($origFile);
   // Must delete the orig file first since SetMediaFileAiring() will fail if the airing is already linked to a media file
   if(MediaFileAPI.DeleteFile($origFile) == true) {
      var $mf = MediaFileAPI.AddMediaFile($newFile, null);
      if(!MediaFileAPI.SetMediaFileAiring($mf, $airing)) {
         // Oops, not good; will have to figure out recovery should this call actually fail at this point
      }
   } else {
      // Handle error
   }
} else {
   // Handle error
}
Is there some reason why this shouldn't work?
__________________
Twitter: @ddb_db
Server: Intel i5-4570 Quad Core, 16GB RAM, 1 x 128GB OS SSD (Win7 Pro x64 SP1), 1 x 2TB media drive
Capture: 2 x Colossus
STB Controller: 1 x USB-UIRT
Software: Java 1.7.0_71; SageTV 7.1.9
Clients: 1 x HD300, 2 x HD200, 1 x SageClient, 1 x PlaceShifter
Plugins: Too many to list now...
  #2
10-21-2009, 01:23 AM
GKusnick
SageTVaholic
 
Join Date: Dec 2005
Posts: 5,083
I seem to recall trying something like this once upon a time, but I don't recall what the outcome was (other than that I obviously didn't pursue it past the experimental stage).

I say give it a try and see what happens. A few lines of Studio code or a few rounds with Expression Evaluator should tell you pretty quickly whether it's going to work.

One thing that makes me somewhat queasy about it is that it's not atomic. A crash between DeleteFile and AddMediaFile could leave you in a bad state that you'd have to patch up by hand. To make it really robust, you'd probably want to write out some sort of intentions file at the start of the operation, with sufficient information to detect various failure modes and recover from them automatically on restart.
__________________
-- Greg
  #3
10-21-2009, 07:01 AM
Slugger
Yeah, it not being an atomic operation is the real concern. I've tried to figure out how the Transcoder does it, and as far as I can tell (based on the public API), it must be doing something similar. Of course, the Transcoder might have some recovery process built in for failures along the way, which is something I will definitely want to address.

I will experiment once I bring the NAS online, but since I don't have a separate test environment, I'm going to have to experiment on my lone Sage server. I was hoping Opus4 or Narflex might see this before I begin and "talk me off the ledge," so to speak, in case this has zero chance of working for whatever reason.
  #4
10-21-2009, 10:46 AM
tmiranda
SageTVaholic
 
Join Date: Jul 2005
Location: Central Florida, USA
Posts: 5,842
AFAIK you can just use SJQ to move the files from the recording directory to the import directory and then run a "refresh media" from within Sage. If you do that, the recordings will still appear as "recordings" and not "imported". Sage will not record to the import directory even if there are "recordings" in it.

I do this often, manually through Explorer, and it works fine. No need to shut down at all.
__________________

Sage Server: 8th gen Intel based system w/32GB RAM running Ubuntu Linux, HDHomeRun Prime with cable card for recording. Runs headless. Accessed via RD when necessary. Four HD-300 Extenders.
  #5
10-21-2009, 10:53 AM
stuckless
SageTVaholic
 
Join Date: Oct 2007
Location: London, Ontario, Canada
Posts: 9,585
I think that will probably work. It's similar to what I do in BMT (though not exactly the same). Actually, I started a Phoenix "move" API, but I haven't finished it.

I'd consider doing the delete as the last operation, so that it's a little easier to make the whole thing atomic: copy, assign the airing, and if that works, then delete the original.
  #6
10-21-2009, 12:39 PM
Slugger
Quote:
Originally Posted by stuckless
I think that will probably work. It's similar to what I do in BMT (though not exactly the same). Actually, I started a Phoenix "move" API, but I haven't finished it.

I'd consider doing the delete as the last operation, so that it's a little easier to make the whole thing atomic: copy, assign the airing, and if that works, then delete the original.
The API docs say the assign-airing call will fail if the airing is already linked to another media file, which is why I have to delete the old one first; there's no API call to unlink an airing from a media file.
  #7
10-23-2009, 08:47 AM
stuckless
Quote:
Originally Posted by Slugger
The API docs say the assign-airing call will fail if the airing is already linked to another media file, which is why I have to delete the old one first; there's no API call to unlink an airing from a media file.
Well, I guess that complicates the rollback process.
  #8
10-23-2009, 09:16 AM
Slugger
Just finished my first test of this using my shiny new unRAID server.

Everything works fine as far as copying the recording, deleting the old one, and relinking the new file to the airing info. I just have to remember to reset the time stamp of the file to avoid timeline issues during playback; otherwise everything worked smoothly.

Even though you have to delete the old media file before relinking, a rollback is still possible; it just involves copying the file back to its original location. However, once the file copy is successful and verified, the rest of the operations (delete the original MediaFile object, add the new MediaFile object, link the new MediaFile object to the Airing object) are so quick and (knock on wood) so unlikely to fail that I'd feel rather confident as soon as the copy was verified. As long as I keep track of where the recording was copied to, the new media file id that was created, and the airing id it's supposed to be linked to, then even if something failed, recovery should just be a matter of recreating the media file object in wiz.bin and/or relinking it to the airing object in wiz.bin.

The tests I've done so far have been manual, through the Studio Expression Evaluator, so now I'm going to write the program to fully automate this (probably as an internal task in SJQ). The key is proper rollback, and I think that once the file copy is successful and verified, it's not so much a rollback that's needed as the ability to retry the relinking until it succeeds.
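That "retry the relinking until it succeeds" step might look something like this sketch (plain Python showing only the control flow; the two callables are stand-ins for MediaFileAPI.AddMediaFile and MediaFileAPI.SetMediaFileAiring, not real Sage API calls):

```python
import time

def relink_with_retry(add_media_file, set_airing, new_path, airing_id,
                      attempts=5, delay_s=2.0):
    # Once the copy is verified there's nothing left to roll back; just
    # keep retrying the quick wiz.bin operations until they stick.
    for attempt in range(1, attempts + 1):
        mf = add_media_file(new_path)
        if mf is not None and set_airing(mf, airing_id):
            return True
        if attempt < attempts:
            time.sleep(delay_s)
    return False  # give up and leave the recorded state for manual recovery
```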
  #9
01-02-2010, 04:29 PM
BACON
Sage User
 
Join Date: Jan 2010
Location: Chicago, IL
Posts: 15

Slugger, did you have any success with this? What you described is exactly the functionality I was hoping to have, myself: record on a hard drive local to the SageTV server, then have those files eventually and automatically moved (archived) to network storage.

When I found Sage Job Queue, I thought it might be able to handle this, but I had some conditions and prioritizations I wanted to incorporate as far as what gets archived and when, rather than just archiving everything, and it didn't look like Sage Job Queue's filter syntax could accomplish that. In general, the rules I wanted were something like:
  • Don't perform archiving if content is being watched or recorded, or a recording is scheduled within the next X minutes
  • Only archive recordings older than Y days (unless local disk space is below a certain threshold, in which case everything becomes eligible)
  • Archive recordings with the Archive flag set before those with it unset
  • Archive recordings with the Watched flag unset before those with it set
  • Don't archive recordings that come from a favorite whose "Keep At Most" property is non-zero

It's the two ordering rules (the Archive-flag and Watched-flag priorities) that I'm not sure how to express in Sage Job Queue. Basically, I'm thinking of something that, say, 1-3 times an hour, would attempt to archive the next eligible recording: take the list of recordings, sort and filter it by the above rules, and archive the recording at the top of the list. (If I understand Sage Job Queue correctly, it would enqueue all eligible recordings at once, as opposed to just the one with the highest priority and then reevaluating the list on the next iteration.) So I figured I might need a plugin to accomplish what I want (which I'm having trouble getting off the ground), but if you've come up with something similar, I'd be curious to learn more about that, too.
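The sort-and-filter pass described above is straightforward to express in plugin code, even if SJQ's filter syntax can't. A rough sketch (the dict field names here are hypothetical, not actual SageTV API properties):

```python
from datetime import datetime, timedelta

def archive_candidates(recordings, now, min_age_days, low_disk=False):
    # Each recording is a dict with 'recorded' (datetime), 'archive_flag',
    # 'watched', and 'keep_at_most' keys. Returns eligible recordings,
    # best candidate first; a caller would archive candidates[0] each pass.
    cutoff = now - timedelta(days=min_age_days)
    eligible = [r for r in recordings
                if r["keep_at_most"] == 0                 # skip capped favorites
                and (low_disk or r["recorded"] <= cutoff)]  # age gate, waived when disk is low
    # Archive-flagged first, then unwatched before watched, then oldest first.
    return sorted(eligible,
                  key=lambda r: (not r["archive_flag"], r["watched"], r["recorded"]))
```

Re-running this on every pass, instead of enqueueing everything at once, is what lets the priorities reevaluate as recordings get watched or flagged.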

Also, as far as having to delete the original recording before linking the airing to the copy: can you not just rename the original recording so that Sage is unable to find it? That way, if something fails, your rollback can just rename it back rather than copying the file back down from the network.
  #10
01-03-2010, 10:14 AM
Slugger
Quote:
Originally Posted by BACON
Slugger, did you have any success with this? What you described is exactly the functionality I was hoping to have, myself: record on a hard drive local to the SageTV server, then have those files eventually and automatically moved (archived) to network storage.
Yes, I'm using this daily and have successfully archived some 1.5TB of recordings over the last couple months with no issues.

Quote:
When I found Sage Job Queue, I thought it might be able to handle this, but I had some conditions and prioritizations I wanted to incorporate as far as what gets archived and when, rather than just archiving everything, and it didn't look like Sage Job Queue's filter syntax could accomplish that. In general, the rules I wanted were something like:
  • Don't perform archiving if content is being watched or recorded, or a recording is scheduled within the next X minutes
  • Only archive recordings older than Y days (unless local disk space is below a certain threshold, in which case everything becomes eligible)
  • Archive recordings with the Archive flag set before those with it unset
  • Archive recordings with the Watched flag unset before those with it set
  • Don't archive recordings that come from a favorite whose "Keep At Most" property is non-zero

It's the two ordering rules (the Archive-flag and Watched-flag priorities) that I'm not sure how to express in Sage Job Queue. Basically, I'm thinking of something that, say, 1-3 times an hour, would attempt to archive the next eligible recording: take the list of recordings, sort and filter it by the above rules, and archive the recording at the top of the list. (If I understand Sage Job Queue correctly, it would enqueue all eligible recordings at once, as opposed to just the one with the highest priority and then reevaluating the list on the next iteration.) So I figured I might need a plugin to accomplish what I want (which I'm having trouble getting off the ground), but if you've come up with something similar, I'd be curious to learn more about that, too.
The two ordering rules aren't possible in SJQ. You could tell SJQ to archive one before the other, but SJQ is still going to load both into the queue and archive both; all you can do is set the priority so that one runs before the other. SJQ has no way to say "only load one task into the queue, then stop processing"; it always looks at every object and decides whether it should be added to the queue. The last rule isn't possible because the favorite object isn't exposed, though that could be changed (i.e. if an issue ticket were filed).

Quote:
Also, as far as having to delete the original recording before linking the airing to the copy, can you not just rename the original recording so that Sage is unable to find it? That way, if something fails your rollback can just rename it back to its original name rather than copying the file back down from the network.
Yeah, this is eventually what I ended up doing. The first thing I do is rename the original, and a rollback just renames it back if necessary.
  #11
01-21-2010, 10:03 PM
Islander
Sage Advanced User
 
Join Date: Oct 2009
Location: Cayman Islands
Posts: 163
Slugger, would you mind sharing the code for this?

In the past few days I have been considering a similar setup. My data server is an HP485, currently populated with some 10TB of rips of my DVD and BD collection plus recorded shows. None of the drives were formatted with 64K clusters, and although I believe it is theoretically possible to reformat them in software without losing the data, or (shivers) do it manually by "removing" each drive and then reformatting it, the first option is a risk with so much data involved and the second will take some time.

I am using an Atom 330 as the SageTV server, and I think not having 64K clusters is just adding to the performance woes the 330 has, even when everything is connected at gigabit speeds. Recording to a local drive formatted with 64K clusters should alleviate some of the issues whilst I decide what is the best upgrade route for a new SageTV server.

Just so I understand: whilst the recordings are sitting on the local recording drive, they are available in the default recordings screen, same as they will be once they are moved to the "imported" directory residing on the data server, right?

Thanks in advance.
  #12
01-24-2010, 10:20 AM
Slugger
Do you want the actual source code of the move task? If so, all the source code is available on the project site; the code for the move task is specifically located here:

http://code.google.com/p/sagetv-addo...vArchiver.java

If you want an example of how to set up SJQ to do this kind of move, then follow the SJQv3 User's Guide for installing and configuring SJQ, read through the ruleset examples there, and then use this ruleset as a guideline for setting up the move task:

Code:
# Move syndicated shows, movies, plus anything else marked for archival to the NAS
if [IsObjMediaFile == true && $.IsTVFile{} == true && (TimeUntilNextRecording >= "6H" || FreeSpacePercentage < 7) && $.IsFileCurrentlyRecording{} == false && (((($.GetShowCategory{} == "Movie" && $.GetShowSubCategory{} != "Documentary") || $.GetShowTitle{} =% "Friends|Seinfeld|Night Court|Three's Company|Family Guy|Married \\.\\.\\. With Children|The Fresh Prince of Bel-Air") && $.IsLibraryFile{} == false) || ($.IsLibraryFile{} == true && Filename !^ "\\\\nas"))] {
   :MOVERECOPTS "\\\\nas\\tv"
   :FAILALERT "true"
   _MOVEREC
}
Further discussion of this should probably move to the SJQv3 support thread. I subscribe to that thread and will definitely read any messages posted there; I may or may not see further replies in this thread.

Last edited by Slugger; 01-24-2010 at 10:24 AM.
  #13
01-24-2010, 11:40 AM
Islander
Thanks, that is great!