I have a main Tcl proc that sources tons of other Tcl procs in other folders and their subdirectories. For example, the main proc has:
source $basepath/folderA/1A.tcl
source $basepath/folderA/2A.tcl
source $basepath/folderA/3A.tcl
source $basepath/folderB/1B.tcl
source $basepath/folderB/2B.tcl
source $basepath/folderB/3B.tcl
It seems kind of stupid to do it that way when I know I will always source everything in folderA and folderB. Is there a function (or a simple way) that will let me just source all the .tcl files in an entire folder?
7 Answers
#1
Building on ramanman's reply, here's a routine that tackles the problem using the built-in Tcl file commands and works its way down the directory tree recursively.
# findFiles
# basedir - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles { basedir pattern } {

    # Fix the directory name, this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set basedir [string trimright [file join [file normalize $basedir] { }]]
    set fileList {}

    # Look in the current directory for matching files, -type {f r}
    # means only readable normal files are looked at, -nocomplain stops
    # an error being thrown if the returned list is empty
    foreach fileName [glob -nocomplain -type {f r} -path $basedir $pattern] {
        lappend fileList $fileName
    }

    # Now look for any subdirectories in the current directory
    foreach dirName [glob -nocomplain -type {d r} -path $basedir *] {
        # Recursively call the routine on the subdirectory and append any
        # new files to the results
        set subDirList [findFiles $dirName $pattern]
        if { [llength $subDirList] > 0 } {
            foreach subDirFile $subDirList {
                lappend fileList $subDirFile
            }
        }
    }
    return $fileList
}
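With that helper in place, sourcing everything found under $basepath could look like the following sketch (the variable names are just illustrative):
# Source every readable .tcl file under $basepath, walking the
# subdirectories recursively via the findFiles proc defined above.
foreach scriptFile [findFiles $basepath *.tcl] {
    source $scriptFile
}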
#2
It gets trivial with tcllib on board:
package require fileutil
foreach file [fileutil::findByPattern $basepath *.tcl] {
    source $file
}
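If load order matters, the match list can be sorted before sourcing; a small sketch of the same idea:
package require fileutil

# Sort the matches so scripts are sourced in a predictable (lexical) order;
# as far as I know findByPattern makes no ordering guarantee of its own.
foreach file [lsort [fileutil::findByPattern $basepath *.tcl]] {
    source $file
}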
#3
Perhaps a little more platform-independent, and using built-in commands instead of piping to a process:
foreach script [glob [file join $basepath folderA *.tcl]] {
    source $script
}
Repeat for folderB, or loop over both folders as in the sketch below.
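A sketch of that, covering both folders in one pass (the folder names are just the ones from the question):
# Loop over both subfolders instead of duplicating the block per folder.
# -nocomplain keeps glob from raising an error if a folder has no .tcl files.
foreach folder {folderA folderB} {
    foreach script [glob -nocomplain [file join $basepath $folder *.tcl]] {
        source $script
    }
}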
If you have more stringent selection criteria and aren't worried about running on other platforms, using find may be more flexible.
#4
Based on a previous answer, this version handles cycles created by symbolic links and, in the process, also eliminates the duplicate files that symbolic links can cause.
# findFiles
# directory - the directory to start looking in
# pattern - A pattern, as defined by the glob command, that the files must match
proc findFiles {directory pattern} {

    # Fix the directory name, this ensures the directory name is in the
    # native format for the platform and contains a final directory separator
    set directory [string trimright [file join [file normalize $directory] { }]]

    # Starting with the passed in directory, do a breadth first search for
    # subdirectories. Avoid cycles by normalizing all file paths and checking
    # for duplicates at each level.
    set directories [list]
    set parents $directory
    while {[llength $parents] > 0} {

        # Find all the children at the current level
        set children [list]
        foreach parent $parents {
            set children [concat $children [glob -nocomplain -type {d r} -path $parent *]]
        }

        # Normalize the children
        set length [llength $children]
        for {set i 0} {$i < $length} {incr i} {
            lset children $i [string trimright [file join [file normalize [lindex $children $i]] { }]]
        }

        # Make the list of children unique
        set children [lsort -unique $children]

        # Find the children that are not duplicates, use them for the next level
        set parents [list]
        foreach child $children {
            if {[lsearch -sorted $directories $child] == -1} {
                lappend parents $child
            }
        }

        # Append the next level directories to the complete list
        set directories [lsort -unique [concat $directories $parents]]
    }

    # Get all the files in the passed in directory and all its subdirectories
    set result [list]
    foreach directory $directories {
        set result [concat $result [glob -nocomplain -type {f r} -path $directory -- $pattern]]
    }

    # Normalize the filenames
    set length [llength $result]
    for {set i 0} {$i < $length} {incr i} {
        lset result $i [file normalize [lindex $result $i]]
    }

    # Return only unique filenames
    return [lsort -unique $result]
}
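Usage is the same as with the earlier routine; a minimal sketch (note the initial-folder caveat raised in the last answer below):
# findFiles returns normalized, duplicate-free paths, so each script is
# sourced only once even when symbolic links make it reachable by two routes.
foreach scriptFile [findFiles $basepath *.tcl] {
    source $scriptFile
}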
#5
Same idea as schlenk:
package require Tclx
for_recursive_glob scriptName $basepath *.tcl {
    source $scriptName
}
If you only want folderA and folderB and not other folders under $basepath:
package require Tclx
for_recursive_glob scriptName [list $basepath/folderA $basepath/folderB] *.tcl {
    source $scriptName
}
#6
Here is one way:
set includes [open "|find $basedir -name \*.tcl -print" r]
while { [gets $includes include] >= 0 } {
    source $include
}
close $includes
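A closely related sketch that collects the names with exec instead of reading from a pipe channel (this assumes a Unix-style find on the PATH and filenames without embedded newlines):
# Run find once, split its output on newlines, and source each script.
# exec passes *.tcl to find literally; Tcl does not glob-expand exec arguments.
foreach include [split [exec find $basedir -name *.tcl -print] \n] {
    source $include
}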
#7
The answer by Joseph Bui works well except that it skips files in the initial folder.
Change:
set directories [list]
to:
set directories [list $directory]
to fix it.