So, the stupid question "does my LoRA have Text Encoder layers?" is often answered with Python and about three lines of code. But not everyone has Python, or the will to install it.
That's ok, here is a quick and dirty binary to check that 😅
https://github.com/nArn0/read-st
The binaries are in the release; I compiled them for Windows, Linux, and both Intel and Apple Silicon macOS. All around 2MB 👌
You can check the code if you want to see the ugliness of this quickly made bit of Golang:
package main

import (
	"encoding/binary"
	"encoding/json"
	"errors"
	"fmt"
	"io"
	"os"
	"sort"

	"github.com/nlpodyssey/safetensors"
	"golang.org/x/term"
)

func check(err error) {
	if err != nil {
		panic(err)
	}
}

func waitForEnter() {
	// Ugly mess of code to display a prompt and wait for an Enter keypress
	// without displaying user input if other keys are pressed
	oldState, err := term.MakeRaw(int(os.Stdin.Fd()))
	check(err)
	defer term.Restore(int(os.Stdin.Fd()), oldState)
	screen := struct {
		io.Reader
		io.Writer
	}{os.Stdin, os.Stderr}
	terminal := term.NewTerminal(screen, "")
	_, _ = terminal.ReadPassword("Press 'Enter' to exit...")
}

func main() {
	// Check that an argument is provided
	if len(os.Args) == 1 {
		panic(errors.New("no argument provided"))
	}
	// Open the file provided as argument
	f, err := os.Open(os.Args[1])
	check(err)
	defer f.Close()
	// Read the header size (the first 8 bytes, a little-endian uint64)
	var l uint64
	err = binary.Read(f, binary.LittleEndian, &l)
	check(err)
	// Read the JSON header that follows
	header := make([]byte, l)
	_, err = f.ReadAt(header, 8)
	check(err)
	// Decode the metadata from the header
	var metadata safetensors.Metadata
	err = json.Unmarshal(header, &metadata)
	check(err)
	// Get the tensor infos
	tensors := metadata.Tensors()
	// Collect all tensor names
	layers := []string{}
	for name := range tensors {
		layers = append(layers, name)
	}
	// Sort the tensor names
	sort.Strings(layers)
	// Print them
	for _, name := range layers {
		fmt.Println(name)
	}
	// Wait for an Enter keypress
	waitForEnter()
}
For usage, it's simple:
read-st-windows-amd64.exe nArnima.safetensors
lora_unet_blocks_0_adaln_modulation_cross_attn_1.alpha
lora_unet_blocks_0_adaln_modulation_cross_attn_1.lora_down.weight
lora_unet_blocks_0_adaln_modulation_cross_attn_1.lora_up.weight
lora_unet_blocks_0_adaln_modulation_cross_attn_2.alpha
lora_unet_blocks_0_adaln_modulation_cross_attn_2.lora_down.weight
lora_unet_blocks_0_adaln_modulation_cross_attn_2.lora_up.weight
(...)
lora_unet_blocks_9_self_attn_k_proj.alpha
lora_unet_blocks_9_self_attn_k_proj.lora_down.weight
lora_unet_blocks_9_self_attn_k_proj.lora_up.weight
lora_unet_blocks_9_self_attn_output_proj.alpha
lora_unet_blocks_9_self_attn_output_proj.lora_down.weight
lora_unet_blocks_9_self_attn_output_proj.lora_up.weight
lora_unet_blocks_9_self_attn_q_proj.alpha
lora_unet_blocks_9_self_attn_q_proj.lora_down.weight
lora_unet_blocks_9_self_attn_q_proj.lora_up.weight
lora_unet_blocks_9_self_attn_v_proj.alpha
lora_unet_blocks_9_self_attn_v_proj.lora_down.weight
lora_unet_blocks_9_self_attn_v_proj.lora_up.weight
Press 'Enter' to exit...
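From that listing you can answer the original question by eye: kohya-style LoRAs name their Text Encoder tensors with a "lora_te" prefix (everything above is "lora_unet", so this one has none). If you wanted the tool to decide for you, a tiny filter could look like this (hasTextEncoder is a hypothetical helper of mine, not part of the released binary):

```go
package main

import (
	"fmt"
	"strings"
)

// hasTextEncoder reports whether any tensor name uses the
// "lora_te" prefix that kohya-style LoRAs give Text Encoder layers.
func hasTextEncoder(names []string) bool {
	for _, n := range names {
		if strings.HasPrefix(n, "lora_te") {
			return true
		}
	}
	return false
}

func main() {
	// Two names taken from the listing above: UNet-only, no Text Encoder
	names := []string{
		"lora_unet_blocks_0_adaln_modulation_cross_attn_1.alpha",
		"lora_unet_blocks_9_self_attn_v_proj.lora_up.weight",
	}
	fmt.Println(hasTextEncoder(names)) // prints false
}
```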
Even opening a terminal is not needed. On Windows, just drag and drop the safetensors file on top of the binary and you'll get what you want 😁
PS: I'll probably do a V2 later to also read the metadata and, if it's a LoRA, the dimension. And with a bit more error checking; for now, better not to try it on anything other than a safetensors file 😝

