Class SFTPHelper

java.lang.Object
com.logicaldoc.importfolder.CrawlerHelper
com.logicaldoc.importfolder.sftp.SFTPHelper

public class SFTPHelper extends CrawlerHelper
Helper for SFTP folders
Since:
8.0
Author:
Marco Meschieri - LogicalDOC
  • Constructor Details

    • SFTPHelper

      public SFTPHelper(ImportFolder importFolder, ImportFolderCrawler crawler) throws com.logicaldoc.core.PersistenceException
      Throws:
      com.logicaldoc.core.PersistenceException
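      A minimal construction sketch, assuming the ImportFolder configuration and the current ImportFolderCrawler task are supplied by the surrounding import-folder machinery (their packages, guessed below as com.logicaldoc.importfolder, are not shown on this page):

        import com.logicaldoc.core.PersistenceException;
        import com.logicaldoc.importfolder.ImportFolder;          // assumed package
        import com.logicaldoc.importfolder.ImportFolderCrawler;   // assumed package
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Hypothetical factory method: both arguments come from the caller.
        static SFTPHelper newSftpHelper(ImportFolder importFolder, ImportFolderCrawler crawler)
                throws PersistenceException {
            return new SFTPHelper(importFolder, crawler);
        }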
  • Method Details

    • checkinFile

      public void checkinFile(Object file, long docId, com.logicaldoc.core.security.user.User owner) throws IOException, com.logicaldoc.core.PersistenceException
      Description copied from class: CrawlerHelper
      Imports a remote file into an existing document, producing a new version
      Specified by:
      checkinFile in class CrawlerHelper
      Parameters:
      file - The file to be loaded
      docId - The document identifier
      owner - The owner user
      Throws:
      IOException - I/O error
      com.logicaldoc.core.PersistenceException - Error in the database layer
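      A hedged sketch of a new-version checkin, assuming the document identifier and owner are already known and the remote file is resolved through getFile (documented below); checkinNewVersion is a hypothetical wrapper:

        import java.io.IOException;
        import com.logicaldoc.core.PersistenceException;
        import com.logicaldoc.core.security.user.User;
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Resolve the remote path and check it in as a new version of document 'docId'.
        static void checkinNewVersion(SFTPHelper helper, String remotePath, long docId, User owner)
                throws IOException, PersistenceException {
            Object remoteFile = helper.getFile(remotePath);
            if (remoteFile != null)
                helper.checkinFile(remoteFile, docId, owner);
        }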
    • importFile

      public com.logicaldoc.core.document.Document importFile(Object file, com.logicaldoc.core.folder.Folder folder, com.logicaldoc.core.security.user.User owner, boolean timestamp) throws IOException, com.logicaldoc.core.PersistenceException
      Description copied from class: CrawlerHelper
      Imports a remote file
      Specified by:
      importFile in class CrawlerHelper
      Parameters:
      file - The file to be loaded
      folder - The target folder
      owner - The owner user
      timestamp - Whether the timestamp has to be included in the filename and title
      Returns:
      The newly created document
      Throws:
      IOException - I/O exception
      com.logicaldoc.core.PersistenceException - Error in the data layer
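      A sketch of a single-file import, assuming the target Folder and owner User are obtained elsewhere; importOne is a hypothetical wrapper and the timestamp flag is set to true only as an example:

        import java.io.IOException;
        import com.logicaldoc.core.PersistenceException;
        import com.logicaldoc.core.document.Document;
        import com.logicaldoc.core.folder.Folder;
        import com.logicaldoc.core.security.user.User;
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Import one remote file into 'target', appending the timestamp to filename and title.
        static Document importOne(SFTPHelper helper, Object remoteFile, Folder target, User owner)
                throws IOException, PersistenceException {
            return helper.importFile(remoteFile, target, owner, true);
        }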
    • list

      public void list(Object parent, int depth, List<Object> folders, List<Object> files, long max, ImportFolderCache cache) throws IOException
      Description copied from class: CrawlerHelper
      Lists all files in a remote folder
      Specified by:
      list in class CrawlerHelper
      Parameters:
      parent - The parent directory
      depth - The maximum depth
      folders - The list that will contain all allowed folders
      files - The list that will contain all allowed files
      max - The maximum number of elements in files
      cache - Cache of imported documents
      Throws:
      IOException - I/O error
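      A sketch of a bounded listing, assuming the parent directory object and the ImportFolderCache come from the running crawler (the cache's package, guessed as com.logicaldoc.importfolder, is not shown here); the depth and max values are only examples:

        import java.io.IOException;
        import java.util.ArrayList;
        import java.util.List;
        import com.logicaldoc.importfolder.ImportFolderCache;   // assumed package
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Collect at most 1000 files found under 'parent', descending at most 5 levels.
        static List<Object> collectFiles(SFTPHelper helper, Object parent, ImportFolderCache cache)
                throws IOException {
            List<Object> folders = new ArrayList<>();
            List<Object> files = new ArrayList<>();
            helper.list(parent, 5, folders, files, 1000L, cache);
            return files;
        }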
    • getName

      public String getName(Object file)
      Description copied from class: CrawlerHelper
      Computes the name of the remote file
      Specified by:
      getName in class CrawlerHelper
      Parameters:
      file - The file to be considered
      Returns:
      The name
    • getPath

      public String getPath(Object file)
      Description copied from class: CrawlerHelper
      Computes the path of the remote file
      Specified by:
      getPath in class CrawlerHelper
      Parameters:
      file - The file to be considered
      Returns:
      The path
    • getLastModified

      public Date getLastModified(Object file)
      Description copied from class: CrawlerHelper
      Computes the remote file's last modification time
      Specified by:
      getLastModified in class CrawlerHelper
      Parameters:
      file - The file to be considered
      Returns:
      The last modification time
    • getCreationDate

      public Date getCreationDate(Object file)
      Description copied from class: CrawlerHelper
      Computes the remote file's creation date
      Specified by:
      getCreationDate in class CrawlerHelper
      Parameters:
      file - The file to be considered
      Returns:
      The creation date
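      A small sketch that combines getName, getPath, getLastModified and getCreationDate to describe a remote entry previously obtained from list or getFile; describe is a hypothetical helper:

        import java.util.Date;
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Print the descriptive metadata the helper computes for a remote entry.
        static void describe(SFTPHelper helper, Object remoteFile) {
            String name = helper.getName(remoteFile);
            String path = helper.getPath(remoteFile);
            Date created = helper.getCreationDate(remoteFile);
            Date modified = helper.getLastModified(remoteFile);
            System.out.printf("%s (%s): created %s, last modified %s%n", name, path, created, modified);
        }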
    • importUsingIndex

      public void importUsingIndex(ImportFolderCrawler crawler) throws IOException, com.logicaldoc.core.PersistenceException, ParserConfigurationException, SAXException
      Description copied from class: CrawlerHelper
      Imports documents referenced by an index file
      Specified by:
      importUsingIndex in class CrawlerHelper
      Parameters:
      crawler - The current ImportFolderCrawler task
      Throws:
      IOException - I/O error
      com.logicaldoc.core.PersistenceException - Error in the database layer
      ParserConfigurationException - XML error
      SAXException - XML error
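      A sketch of an index-driven import, assuming the current ImportFolderCrawler task is passed in by the caller (its package, guessed as com.logicaldoc.importfolder, is not shown here):

        import java.io.IOException;
        import javax.xml.parsers.ParserConfigurationException;
        import org.xml.sax.SAXException;
        import com.logicaldoc.core.PersistenceException;
        import com.logicaldoc.importfolder.ImportFolderCrawler;   // assumed package
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Run the import driven by the remote index file for the given crawler task.
        static void importFromIndex(SFTPHelper helper, ImportFolderCrawler crawler)
                throws IOException, PersistenceException, ParserConfigurationException, SAXException {
            helper.importUsingIndex(crawler);
        }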
    • importDocumentsCount

      public int importDocumentsCount() throws IOException, com.logicaldoc.core.PersistenceException, ParserConfigurationException, SAXException
      Description copied from class: CrawlerHelper
      Counts the documents, referenced by an index file, that should be imported. It returns -1 in case there is no index file
      Specified by:
      importDocumentsCount in class CrawlerHelper
      Returns:
      the number of document references inside the index file
      Throws:
      IOException - I/O error
      com.logicaldoc.core.PersistenceException - Error in the data layer
      ParserConfigurationException - XML error
      SAXException - XML error
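      A sketch combining the count with the index-driven import, relying on the documented -1 return when no index file is present; importIfIndexed is a hypothetical wrapper (imports as in the previous sketch):

        // Only run the index-driven import when an index file actually references documents.
        static void importIfIndexed(SFTPHelper helper, ImportFolderCrawler crawler)
                throws IOException, PersistenceException, ParserConfigurationException, SAXException {
            int referenced = helper.importDocumentsCount();
            if (referenced > 0)   // -1 means there is no index file
                helper.importUsingIndex(crawler);
        }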
    • deleteFile

      public boolean deleteFile(Object file)
      Description copied from class: CrawlerHelper
      Deletes the given referenced file after document import
      Specified by:
      deleteFile in class CrawlerHelper
      Parameters:
      file - The file that must be deleted
      Returns:
      true if the file has been successfully deleted
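      A sketch of import-then-cleanup, assuming the import folder is meant to have remote files removed after a successful import; importAndCleanup, the target folder and the owner are all hypothetical here:

        import java.io.IOException;
        import com.logicaldoc.core.PersistenceException;
        import com.logicaldoc.core.document.Document;
        import com.logicaldoc.core.folder.Folder;
        import com.logicaldoc.core.security.user.User;
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Import a remote file and then try to delete it from the SFTP server.
        static void importAndCleanup(SFTPHelper helper, Object remoteFile, Folder target, User owner)
                throws IOException, PersistenceException {
            Document doc = helper.importFile(remoteFile, target, owner, false);
            if (doc != null && !helper.deleteFile(remoteFile))
                System.err.println("Could not delete remote file " + helper.getPath(remoteFile));
        }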
    • testConnection

      public boolean testConnection()
      Description copied from class: CrawlerHelper
      Tests if the import folder can be accessed
      Specified by:
      testConnection in class CrawlerHelper
      Returns:
      true if the import folder is accessible
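      A sketch that guards a crawl behind the connectivity test; canCrawl is a hypothetical helper:

        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Verify the SFTP import folder is reachable before doing any work.
        static boolean canCrawl(SFTPHelper helper) {
            boolean ok = helper.testConnection();
            if (!ok)
                System.err.println("SFTP import folder is not accessible");
            return ok;
        }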
    • getFile

      public Object getFile(String path)
      Description copied from class: CrawlerHelper
      Gets the object representation of the given path
      Specified by:
      getFile in class CrawlerHelper
      Parameters:
      path - the full path of a remote file
      Returns:
      the remote file at the specified path
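      A sketch resolving a remote path to the helper's file representation and reading its computed name; the path literal is purely illustrative:

        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Resolve a remote path and print the computed file name.
        static void printRemoteName(SFTPHelper helper) {
            Object remoteFile = helper.getFile("/incoming/report.pdf");   // hypothetical path
            if (remoteFile != null)
                System.out.println("Resolved: " + helper.getName(remoteFile));
        }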
    • getContentFile

      public File getContentFile(String path)
      Description copied from class: CrawlerHelper
      Retrieves a file that contains the content referenced by path
      Specified by:
      getContentFile in class CrawlerHelper
      Parameters:
      path - the full path of a remote file
      Returns:
      a local file containing the remote file's content
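      A sketch that downloads the remote content into a local java.io.File and reports its size; remoteContentLength is a hypothetical helper:

        import java.io.File;
        import com.logicaldoc.importfolder.sftp.SFTPHelper;

        // Fetch the content of a remote file into a local file and return its length in bytes.
        static long remoteContentLength(SFTPHelper helper, String remotePath) {
            File local = helper.getContentFile(remotePath);
            return local != null ? local.length() : -1L;
        }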