Generate screenshots in batch

Via batch script

import os

import numpy as np


def bed_to_igv(bed_file, output_folder, output_prefix, flank=0, output_format="svg", thread=1):
    """
    Convert a BED file into one or more IGV batch scripts.

    :param bed_file: str, path to the input BED file
    :param output_folder: str, directory where IGV will save the snapshots
    :param output_prefix: str, prefix for the generated .igv script(s)
    :param flank: flanking region in bp added to each side, default 0
    :param output_format: snapshot format, default svg, alternative: png
    :param thread: int, number of batch scripts to split the snapshots into
    :return: None
    """
    flank = int(flank)
    set_path_command = "snapshotDirectory %s\n" % output_folder
    result = []
    with open(bed_file, "r") as fh:
        for line in fh:
            items = line.strip().split("\t")
            start = int(items[1]) - flank
            end = int(items[2]) + flank
            result.append("goto %s:%d-%d\nsnapshot %s:%s-%s.%s\n" % (items[0], start, end,
                                                                     items[0], items[1],
                                                                     items[2], output_format))
    if thread == 1:
        fn = os.path.join(output_folder, output_prefix + ".igv")
        # the snapshot directory must be set before the first snapshot command
        result.insert(0, set_path_command)
        with open(fn, "w") as fnh:
            fnh.writelines(result)
    elif thread > 1:
        # split the snapshot commands evenly into one batch script per thread
        thread_jobs = np.array_split(result, thread)
        for i, job in enumerate(thread_jobs):
            fn = os.path.join(output_folder, output_prefix + "_%d.igv" % i)
            # each script needs its own snapshotDirectory line
            job = np.insert(job, 0, set_path_command)
            with open(fn, "w") as fnh:
                fnh.writelines(job)
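Each generated .igv file is a plain IGV batch script. As a sketch, for a single BED record chr1 100 200 with flank=50 and an output folder of /path/to/snapshots (both hypothetical values), the script produced above would contain:

```
snapshotDirectory /path/to/snapshots
goto chr1:50-250
snapshot chr1:100-200.svg
```

You can then run it from IGV via Tools > Run Batch Script, or headlessly with igv.sh -b script.igv.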

Via Websocket

Misc

Add customized annotations to IGV

https://data.broadinstitute.org/igvdata/$$_dataServerRegistry.txt

Always load BED files with indices

If you don't prepare an index for a BED file before loading it into IGV, IGV will read every interval into memory and build an index on the fly, which costs an enormous amount of memory and a long computation time. A good practice for visualizing genomic intervals in IGV is therefore to generate indices for these interval files with tabix first, and only then load them into IGV. For example, loading an interval file test_file_1.bed.gz with 18M records into IGV without an index takes more than 25GB of memory!

But if you first run tabix -p bed test_file_1.bed.gz to generate the index, and then feed IGV the same file, it only takes 2GB!
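As a sketch of the full workflow (assuming htslib is installed, and that test_file_1.bed is the uncompressed input): tabix requires the file to be coordinate-sorted and compressed with bgzip, not plain gzip.

```shell
# sort by chromosome, then start position
sort -k1,1 -k2,2n test_file_1.bed > test_file_1.sorted.bed

# compress with bgzip (tabix cannot index plain gzip output)
bgzip test_file_1.sorted.bed              # produces test_file_1.sorted.bed.gz

# build the index; -p bed selects the BED preset
tabix -p bed test_file_1.sorted.bed.gz    # produces test_file_1.sorted.bed.gz.tbi
```

Keep the .tbi file next to the .gz file; IGV picks it up automatically when you load the compressed BED.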